Apr 22 19:57:44.563134 ip-10-0-143-253 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:57:45.029535 ip-10-0-143-253 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:45.029535 ip-10-0-143-253 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:57:45.029535 ip-10-0-143-253 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:45.029535 ip-10-0-143-253 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:57:45.029535 ip-10-0-143-253 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:45.031455 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.031357 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:57:45.040083 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040055 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:45.040083 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040078 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:45.040083 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040083 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:45.040083 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040086 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:45.040083 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040090 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:45.040083 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040093 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040097 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040100 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040102 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040105 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040108 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040110 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040113 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040115 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040118 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040121 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040124 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040127 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040130 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040133 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040135 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040138 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040141 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040144 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040146 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:45.040329 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040149 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040151 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040154 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040156 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040159 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040161 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040164 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040170 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040173 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040175 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040178 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040181 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040183 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040186 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040189 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040193 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040196 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040198 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040201 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040204 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040207 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:45.040801 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040209 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040212 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040214 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040217 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040220 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040222 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040225 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040227 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040230 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040233 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040236 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040238 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040241 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040244 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040246 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040250 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040254 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040257 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040259 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:45.041335 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040262 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040264 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040267 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040269 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040272 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040275 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040279 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040282 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040285 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040287 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040291 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040294 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040296 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040300 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040303 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040305 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040308 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040310 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040313 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040316 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:45.041827 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040318 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040729 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040733 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040736 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040739 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040742 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040744 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040747 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040750 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040753 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040756 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040759 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040762 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040765 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040768 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040771 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040773 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040776 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040779 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040782 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:45.042327 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040784 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040788 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040790 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040794 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040797 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040800 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040803 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040806 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040809 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040812 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040814 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040817 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040820 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040823 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040825 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040828 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040830 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040852 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040856 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040859 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:45.042851 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040861 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040865 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040869 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040873 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040883 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040888 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040891 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040894 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040896 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040899 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040902 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040904 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040907 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040909 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040912 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040914 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040917 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040919 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040923 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:45.043385 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040925 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040928 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040930 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040934 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040938 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040941 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040944 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040947 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040950 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040953 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040956 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040958 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040961 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040963 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040967 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040969 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040972 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040974 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040977 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:45.043942 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040980 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040982 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040985 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040987 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040990 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040992 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040995 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.040997 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.041000 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042217 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042229 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042240 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042244 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042254 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042258 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042262 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042267 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042270 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042273 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042277 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042280 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042283 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:57:45.044402 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042287 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042290 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042293 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042296 2577 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042299 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042302 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042307 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042310 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042313 2577 flags.go:64] FLAG: --config-dir=""
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042316 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042319 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042323 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042326 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042329 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042333 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042336 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042339 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042343 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042346 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042349 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042354 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042357 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042360 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042363 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042366 2577 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:57:45.044959 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042369 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042374 2577 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042378 2577 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042380 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042383 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042387 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042391 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042394 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042397 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042400 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042403 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042406 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042409 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042412 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042415 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042418 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042421 2577 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042425 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042428 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042432 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042435 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 19:57:45.045558
ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042438 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042442 2577 flags.go:64] FLAG: --help="false" Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042445 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042448 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:57:45.045558 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042451 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042454 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042457 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042461 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042464 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042466 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042469 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042473 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042476 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042479 2577 
flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042482 2577 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042489 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042491 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042495 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042497 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042500 2577 flags.go:64] FLAG: --lock-file="" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042503 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042506 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042510 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042515 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042518 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042521 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042524 2577 flags.go:64] FLAG: --logging-format="text" Apr 22 19:57:45.046163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042527 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:57:45.046710 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:57:45.042530 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042533 2577 flags.go:64] FLAG: --manifest-url="" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042536 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042541 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042544 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042548 2577 flags.go:64] FLAG: --max-pods="110" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042551 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042554 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042557 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042560 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042563 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042566 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042569 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042577 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042579 2577 
flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042583 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042587 2577 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042590 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042597 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042600 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042604 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042607 2577 flags.go:64] FLAG: --port="10250" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042611 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:57:45.046710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042614 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00294ac8449843ad5" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042617 2577 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042620 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042623 2577 flags.go:64] FLAG: --register-node="true" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042626 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042629 2577 flags.go:64] FLAG: 
--register-with-taints="" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042632 2577 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042635 2577 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042638 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042641 2577 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042644 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042647 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042651 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042653 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042656 2577 flags.go:64] FLAG: --runonce="false" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042659 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042662 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042665 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042668 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042685 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042689 
2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042692 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042696 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042699 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042702 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042705 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:57:45.047296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042708 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042712 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042715 2577 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042719 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042725 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042728 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042731 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042735 2577 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042738 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:57:45.047933 
ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042741 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042745 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042748 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042751 2577 flags.go:64] FLAG: --v="2" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042756 2577 flags.go:64] FLAG: --version="false" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042760 2577 flags.go:64] FLAG: --vmodule="" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042764 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.042768 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042886 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042891 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042895 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042900 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042903 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042906 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:45.047933 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042909 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042912 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042915 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042918 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042920 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042923 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042926 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042928 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042931 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:45.048487 
ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042934 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042937 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042940 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042943 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042946 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042948 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042951 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042953 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042956 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042958 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:45.048487 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042962 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042965 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042968 2577 feature_gate.go:328] unrecognized feature gate: 
DyanmicServiceEndpointIBMCloud Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042970 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042973 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042975 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042978 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042980 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042983 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042985 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042988 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042990 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042993 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042996 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.042998 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:45.049029 ip-10-0-143-253 
kubenswrapper[2577]: W0422 19:57:45.043001 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043004 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043006 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043009 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043012 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:45.049029 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043014 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043017 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043020 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043022 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043025 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043028 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043031 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043033 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:45.049532 
ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043036 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043039 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043041 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043043 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043048 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043050 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043053 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043055 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043058 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043061 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043063 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043066 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:45.049532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043068 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform 
Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043071 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043073 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043076 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043078 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043081 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043083 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043086 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043088 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043092 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043095 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043098 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043101 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043104 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043107 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043109 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043112 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043115 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043118 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:45.050032 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043120 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.043123 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.043129 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.050006 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.050026 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050076 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050082 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050085 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050089 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050092 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050095 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050097 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050100 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:45.050496 ip-10-0-143-253 
kubenswrapper[2577]: W0422 19:57:45.050103 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050105 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:45.050496 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050109 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050112 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050114 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050117 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050119 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050122 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050125 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050128 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050131 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050133 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050136 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 
19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050139 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050141 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050144 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050146 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050149 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050151 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050154 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050156 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050159 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:45.050895 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050162 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050166 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050169 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 
19:57:45.050171 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050174 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050177 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050179 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050182 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050185 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050187 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050189 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050192 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050194 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050197 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050200 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050203 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050208 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050210 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050213 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050216 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:45.051382 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050218 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050221 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050224 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050226 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050229 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050231 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050234 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050237 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050239 2577 feature_gate.go:328] unrecognized 
feature gate: NetworkLiveMigration Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050242 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050245 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050248 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050251 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050254 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050258 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050260 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050263 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050266 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050269 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050272 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:45.051884 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050274 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:45.052376 
ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050278 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050282 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050285 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050289 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050292 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050294 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050296 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050299 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050301 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050304 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050307 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050310 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050313 2577 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050315 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:45.052376 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050318 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.050323 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050454 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050459 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050463 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050465 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050468 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050471 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 
19:57:45.050474 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050477 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050481 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050484 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050487 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050490 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050493 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050495 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:45.052778 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050498 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050501 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050504 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050506 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050509 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 
19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050512 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050515 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050517 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050520 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050522 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050525 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050527 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050529 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050532 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050535 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050537 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050539 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050543 2577 feature_gate.go:349] 
Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050547 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050550 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:45.053232 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050553 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050555 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050558 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050562 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050565 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050569 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050572 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050574 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050577 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050581 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050583 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050586 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050589 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050591 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050594 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050597 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050599 2577 feature_gate.go:328] unrecognized 
feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050602 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050604 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:45.053718 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050607 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050609 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050612 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050614 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050617 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050619 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050622 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050624 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050627 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050629 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:45.054198 ip-10-0-143-253 
kubenswrapper[2577]: W0422 19:57:45.050632 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050635 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050637 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050640 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050642 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050644 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050647 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050649 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050652 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050654 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:45.054198 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050657 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050659 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050662 2577 feature_gate.go:328] 
unrecognized feature gate: GatewayAPIController Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050665 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050668 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050670 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050673 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050675 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050678 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050681 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050683 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050686 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:45.050688 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.050693 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true 
SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.050830 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:57:45.054689 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.054593 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:57:45.055670 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.055657 2577 server.go:1019] "Starting client certificate rotation" Apr 22 19:57:45.055772 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.055752 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:57:45.055891 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.055796 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:57:45.082222 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.082194 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:57:45.085198 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.085169 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:57:45.105099 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.105068 2577 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:57:45.110700 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.110681 2577 log.go:25] "Validated CRI v1 image API" Apr 22 19:57:45.111946 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.111931 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 
19:57:45.114246 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.114221 2577 fs.go:135] Filesystem UUIDs: map[131c7691-0a07-4bd8-ad36-93b59234dc67:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 e350f6fe-4cb2-48d0-8679-6063d8d34a8a:/dev/nvme0n1p4] Apr 22 19:57:45.114294 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.114245 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:57:45.118323 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.118306 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:57:45.119757 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.119642 2577 manager.go:217] Machine: {Timestamp:2026-04-22 19:57:45.11829529 +0000 UTC m=+0.433387513 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100046 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b9f367ef211d803ebd136978c7232 SystemUUID:ec2b9f36-7ef2-11d8-03eb-d136978c7232 BootID:f0508e8c-5d0d-49b8-ad66-656ceeba067b Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs 
Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c2:71:a4:f9:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c2:71:a4:f9:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:62:dd:48:8c:88 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:57:45.119757 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.119753 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 19:57:45.119876 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.119851 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:57:45.120212 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.120188 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:57:45.120360 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.120213 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-253.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:57:45.121177 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.121166 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:57:45.121219 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.121179 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:57:45.121219 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.121197 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:45.122803 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.122791 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:45.124342 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.124331 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:45.124449 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.124440 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:57:45.127004 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.126994 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:57:45.127038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.127012 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:57:45.127038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.127025 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:57:45.127038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.127035 2577 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:57:45.127185 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.127049 2577 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 19:57:45.128193 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.128176 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:45.128272 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.128206 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:45.131447 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.131425 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:57:45.133103 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.133087 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:57:45.134967 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.134955 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:57:45.135017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.134974 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:57:45.135017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.134979 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:57:45.135017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.134985 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:57:45.135017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.134991 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:57:45.135017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.134997 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:57:45.135017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.135004 2577 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 19:57:45.135017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.135012 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:57:45.135017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.135020 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:57:45.135220 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.135026 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:57:45.135220 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.135043 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:57:45.135220 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.135052 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:57:45.135476 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.135459 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pzmpp" Apr 22 19:57:45.136062 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.136051 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:57:45.136100 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.136063 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:57:45.138045 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.138023 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:57:45.138111 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.138031 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-253.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:57:45.139923 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.139910 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:57:45.139972 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.139949 2577 server.go:1295] "Started kubelet" Apr 22 19:57:45.140051 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.140015 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:57:45.140172 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.140127 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:57:45.140206 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.140199 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:57:45.140819 ip-10-0-143-253 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 19:57:45.141379 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.141302 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:57:45.142684 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.142670 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:57:45.144716 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.144697 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pzmpp" Apr 22 19:57:45.147043 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.147023 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-253.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:57:45.149444 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.149427 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:57:45.149444 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.149437 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:57:45.150327 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.150310 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:57:45.150464 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150450 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:57:45.150500 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150466 2577 factory.go:55] Registering systemd factory Apr 22 19:57:45.150500 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150474 2577 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:57:45.150575 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150547 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:57:45.150632 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150618 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:57:45.150678 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150632 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:57:45.150778 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150767 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:57:45.150778 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150778 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:57:45.150880 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150871 2577 factory.go:153] Registering CRI-O factory Apr 22 19:57:45.150920 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150882 2577 factory.go:223] Registration of the crio container factory successfully Apr 22 19:57:45.150920 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150899 2577 factory.go:103] Registering Raw factory Apr 22 19:57:45.150920 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.150909 2577 
manager.go:1196] Started watching for new ooms in manager Apr 22 19:57:45.151353 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.146950 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-253.ec2.internal.18a8c619575e0b6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-253.ec2.internal,UID:ip-10-0-143-253.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-253.ec2.internal,},FirstTimestamp:2026-04-22 19:57:45.139923818 +0000 UTC m=+0.455016041,LastTimestamp:2026-04-22 19:57:45.139923818 +0000 UTC m=+0.455016041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-253.ec2.internal,}" Apr 22 19:57:45.151492 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.151473 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found" Apr 22 19:57:45.152417 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.152401 2577 manager.go:319] Starting recovery of all containers Apr 22 19:57:45.158883 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.158864 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:45.161436 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.161269 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-253.ec2.internal\" not found" node="ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.162188 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.162170 2577 manager.go:324] Recovery completed Apr 22 
19:57:45.163473 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.163445 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 19:57:45.167243 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.167230 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:45.169527 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.169512 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:45.169584 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.169542 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:45.169584 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.169554 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:45.170110 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.170094 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:57:45.170110 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.170108 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:57:45.170229 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.170125 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:45.172503 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.172486 2577 policy_none.go:49] "None policy: Start" Apr 22 19:57:45.172561 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.172509 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:57:45.172561 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.172523 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 22 
19:57:45.216481 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.216463 2577 manager.go:341] "Starting Device Plugin manager" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.216497 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.216507 2577 server.go:85] "Starting device plugin registration server" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.216731 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.216740 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.216831 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.216929 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.216941 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.217540 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:57:45.218354 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.217579 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-253.ec2.internal\" not found" Apr 22 19:57:45.246398 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.246356 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 22 19:57:45.247522 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.247503 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:57:45.247579 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.247539 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:57:45.247619 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.247596 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:57:45.247619 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.247606 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:57:45.247707 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.247663 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:57:45.251806 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.251783 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:45.317699 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.317628 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:45.318657 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.318639 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:45.318752 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.318675 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:45.318752 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.318698 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:45.318752 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.318729 2577 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.327180 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.326554 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.327180 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.326613 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-253.ec2.internal\": node \"ip-10-0-143-253.ec2.internal\" not found" Apr 22 19:57:45.342904 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.342879 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found" Apr 22 19:57:45.347911 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.347873 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal"] Apr 22 19:57:45.347991 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.347969 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:45.348858 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.348826 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:45.348950 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.348871 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:45.348950 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.348890 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:45.350091 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350078 2577 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 22 19:57:45.350293 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350276 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.350344 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350308 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:45.350774 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350757 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:45.350864 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350775 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:45.350864 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350789 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:45.350864 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350798 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:45.350864 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350802 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:45.350864 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.350814 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:45.351106 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.351090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/6a8d971fc4211db7d5e8400be76011e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal\" (UID: \"6a8d971fc4211db7d5e8400be76011e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.351145 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.351114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a8d971fc4211db7d5e8400be76011e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal\" (UID: \"6a8d971fc4211db7d5e8400be76011e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.351145 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.351138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68c5e58877595fc451d476fd9e217735-config\") pod \"kube-apiserver-proxy-ip-10-0-143-253.ec2.internal\" (UID: \"68c5e58877595fc451d476fd9e217735\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.352075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.352064 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.352118 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.352087 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:45.352723 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.352709 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:45.352767 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.352740 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:45.352767 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.352755 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:45.379805 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.379782 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-253.ec2.internal\" not found" node="ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.384313 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.384296 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-253.ec2.internal\" not found" node="ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.443751 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.443723 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found" Apr 22 19:57:45.451449 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.451425 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6a8d971fc4211db7d5e8400be76011e5-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal\" (UID: \"6a8d971fc4211db7d5e8400be76011e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.451576 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.451456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a8d971fc4211db7d5e8400be76011e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal\" (UID: \"6a8d971fc4211db7d5e8400be76011e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.451576 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.451473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68c5e58877595fc451d476fd9e217735-config\") pod \"kube-apiserver-proxy-ip-10-0-143-253.ec2.internal\" (UID: \"68c5e58877595fc451d476fd9e217735\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.451576 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.451508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/68c5e58877595fc451d476fd9e217735-config\") pod \"kube-apiserver-proxy-ip-10-0-143-253.ec2.internal\" (UID: \"68c5e58877595fc451d476fd9e217735\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.451576 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.451515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6a8d971fc4211db7d5e8400be76011e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal\" (UID: \"6a8d971fc4211db7d5e8400be76011e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" Apr 22 19:57:45.451576 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:57:45.451522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a8d971fc4211db7d5e8400be76011e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal\" (UID: \"6a8d971fc4211db7d5e8400be76011e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal"
Apr 22 19:57:45.543906 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.543861 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:45.644595 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.644513 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:45.682999 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.682971 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal"
Apr 22 19:57:45.686529 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:45.686498 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal"
Apr 22 19:57:45.745547 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.745512 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:45.846048 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.846006 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:45.946628 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:45.946553 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:46.047070 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:46.047038 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:46.056485 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.056465 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:57:46.056628 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.056604 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:57:46.056698 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.056648 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:57:46.147536 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:46.147497 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:46.147710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.147539 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:52:45 +0000 UTC" deadline="2027-11-11 14:29:11.976574777 +0000 UTC"
Apr 22 19:57:46.147710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.147567 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13626h31m25.829010712s"
Apr 22 19:57:46.149552 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.149534 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:57:46.166969 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.166939 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:57:46.191110 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.191088 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bbhmh"
Apr 22 19:57:46.196955 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:46.196892 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c5e58877595fc451d476fd9e217735.slice/crio-17a98ce0e4d0e729937ef16e47b2b2c030e6c6d814a843707c395cf5efadfe6b WatchSource:0}: Error finding container 17a98ce0e4d0e729937ef16e47b2b2c030e6c6d814a843707c395cf5efadfe6b: Status 404 returned error can't find the container with id 17a98ce0e4d0e729937ef16e47b2b2c030e6c6d814a843707c395cf5efadfe6b
Apr 22 19:57:46.197699 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:46.197680 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8d971fc4211db7d5e8400be76011e5.slice/crio-c08f5403e73594ce684686c40a55c46d57260a8e72ea4960db9018c959146c52 WatchSource:0}: Error finding container c08f5403e73594ce684686c40a55c46d57260a8e72ea4960db9018c959146c52: Status 404 returned error can't find the container with id c08f5403e73594ce684686c40a55c46d57260a8e72ea4960db9018c959146c52
Apr 22 19:57:46.199393 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.199358 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bbhmh"
Apr 22 19:57:46.201238 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.201224 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:57:46.247911 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:46.247877 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:46.250914 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.250860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal" event={"ID":"68c5e58877595fc451d476fd9e217735","Type":"ContainerStarted","Data":"17a98ce0e4d0e729937ef16e47b2b2c030e6c6d814a843707c395cf5efadfe6b"}
Apr 22 19:57:46.251804 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.251773 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" event={"ID":"6a8d971fc4211db7d5e8400be76011e5","Type":"ContainerStarted","Data":"c08f5403e73594ce684686c40a55c46d57260a8e72ea4960db9018c959146c52"}
Apr 22 19:57:46.348188 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:46.348156 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:46.448737 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:46.448677 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:46.549207 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:46.549170 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:46.649420 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:46.649393 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-253.ec2.internal\" not found"
Apr 22 19:57:46.649572 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.649522 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:57:46.739513 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.739276 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:57:46.750401 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.750370 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal"
Apr 22 19:57:46.761770 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.761708 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:57:46.762778 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.762754 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal"
Apr 22 19:57:46.771195 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:46.771170 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:57:47.129011 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.128978 2577 apiserver.go:52] "Watching apiserver"
Apr 22 19:57:47.134619 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.134591 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:57:47.137307 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.137212 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-msxbb","openshift-image-registry/node-ca-rpfnc","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb","openshift-multus/multus-additional-cni-plugins-p4l6b","openshift-multus/multus-pkpcm","openshift-multus/network-metrics-daemon-5dv89","openshift-network-diagnostics/network-check-target-t68sf","openshift-network-operator/iptables-alerter-kpwmj","openshift-ovn-kubernetes/ovnkube-node-wrbxl","kube-system/global-pull-secret-syncer-x55sm","kube-system/konnectivity-agent-wzwch","kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal","openshift-cluster-node-tuning-operator/tuned-46d9d"]
Apr 22 19:57:47.139423 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.139390 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:57:47.139545 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.139480 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:57:47.140604 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.140584 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:47.140710 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.140649 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:57:47.141693 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.141675 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.142941 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.142922 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.144156 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.144073 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-88jdn\""
Apr 22 19:57:47.144156 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.144099 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:57:47.144342 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.144300 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:57:47.145124 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.145104 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:57:47.145206 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.145111 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:57:47.145206 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.145148 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:57:47.145446 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.145427 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4rlrc\""
Apr 22 19:57:47.145543 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.145523 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.146703 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.146686 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.146875 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.146826 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kpwmj"
Apr 22 19:57:47.147610 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.147595 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:57:47.147695 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.147628 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:57:47.147856 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.147815 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:57:47.147856 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.147854 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lkmrr\""
Apr 22 19:57:47.148001 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.147864 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:57:47.148001 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.147898 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:57:47.148173 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.148154 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:57:47.148252 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.148220 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:57:47.148688 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.148672 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-f6lmc\""
Apr 22 19:57:47.149131 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.149117 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:57:47.149198 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.149152 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:57:47.149266 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.149254 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:57:47.149455 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.149428 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tsf74\""
Apr 22 19:57:47.149805 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.149539 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rpfnc"
Apr 22 19:57:47.149805 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.149737 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:57:47.151253 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.151231 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.151594 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.151460 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:57:47.151759 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.151736 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:57:47.151866 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.151850 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-txvmq\""
Apr 22 19:57:47.152155 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.152138 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:57:47.152674 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.152655 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-msxbb"
Apr 22 19:57:47.153477 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.153350 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:57:47.153477 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.153431 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:57:47.153477 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.153446 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:57:47.153690 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.153654 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:57:47.153774 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.153718 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:57:47.153907 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.153872 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2h9bq\""
Apr 22 19:57:47.154012 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.153913 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:57:47.154166 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.154151 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wzwch"
Apr 22 19:57:47.154823 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.154805 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:57:47.154925 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.154812 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4fjn6\""
Apr 22 19:57:47.154925 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.154851 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:57:47.156528 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.156511 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:57:47.156616 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.156557 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x4jwj\""
Apr 22 19:57:47.157533 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.157512 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:57:47.160939 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.160914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysctl-d\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161034 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.160953 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5fwz\" (UniqueName: \"kubernetes.io/projected/e34993a1-0c8c-4395-9202-5841e22c2788-kube-api-access-n5fwz\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.161034 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.160979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dlq4\" (UniqueName: \"kubernetes.io/projected/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-kube-api-access-8dlq4\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc"
Apr 22 19:57:47.161034 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.160999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-netns\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.161034 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161022 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161225 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzzfk\" (UniqueName: \"kubernetes.io/projected/36e9b580-270c-4cbb-b3e6-78fde6f244ec-kube-api-access-vzzfk\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161225 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysctl-conf\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161225 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-tuned\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161225 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.161225 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-os-release\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.161225 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-log-socket\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovnkube-script-lib\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-modprobe-d\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysconfig\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-kubernetes\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161348 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-sys\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-registration-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0763314b-12d3-4771-844c-120f25ae1bc3-dbus\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovnkube-config\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8prlz\" (UniqueName: \"kubernetes.io/projected/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-kube-api-access-8prlz\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161461 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-host\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc"
Apr 22 19:57:47.161506 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zggk\" (UniqueName: \"kubernetes.io/projected/8f5cde8a-b2da-4205-8a72-53560841ac3b-kube-api-access-5zggk\") pod \"iptables-alerter-kpwmj\" (UID: \"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161518 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-lib-modules\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-os-release\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-etc-kubernetes\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161613 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-cni-netd\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovn-node-metrics-cert\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-device-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-etc-selinux\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161766 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-env-overrides\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-etc-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161870 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0763314b-12d3-4771-844c-120f25ae1bc3-kubelet-config\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-system-cni-dir\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.161985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-k8s-cni-cncf-io\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161963 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-cni-bin\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.161998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-daemon-config\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-systemd\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-ovn\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-sys-fs\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162101 2577 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-cni-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-cni-multus\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-run\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-var-lib-kubelet\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.162568 
ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162233 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-socket-dir-parent\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-node-log\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-cnibin\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162354 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/e24f6b8a-d137-4b5b-94b4-011f680ada1d-cni-binary-copy\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.162568 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-kubelet\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-host\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-system-cni-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-cnibin\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2rmhd\" (UniqueName: \"kubernetes.io/projected/e24f6b8a-d137-4b5b-94b4-011f680ada1d-kube-api-access-2rmhd\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-systemd-units\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-systemd\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-var-lib-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.163216 
ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-serviceca\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f5cde8a-b2da-4205-8a72-53560841ac3b-host-slash\") pod \"iptables-alerter-kpwmj\" (UID: \"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162670 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-run-netns\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-tmp\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvqr8\" (UniqueName: \"kubernetes.io/projected/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-kube-api-access-qvqr8\") pod \"network-metrics-daemon-5dv89\" (UID: 
\"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162749 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-multus-certs\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.163216 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162797 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:47.163775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bl4\" (UniqueName: \"kubernetes.io/projected/c28089f2-d625-4e69-b372-16c2a540e3a1-kube-api-access-q4bl4\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.163775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-slash\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.163775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-hostroot\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.163775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-conf-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.163775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162973 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-socket-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" Apr 22 19:57:47.163775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.162988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-kubelet\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.163775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.163002 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8f5cde8a-b2da-4205-8a72-53560841ac3b-iptables-alerter-script\") pod \"iptables-alerter-kpwmj\" (UID: \"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj" Apr 22 19:57:47.163775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.163050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-cni-bin\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.199991 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.199957 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:46 +0000 UTC" deadline="2028-01-02 02:45:44.16490549 +0000 UTC" Apr 22 19:57:47.199991 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.199987 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14862h47m56.964921847s" Apr 22 19:57:47.242460 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.242426 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:47.245238 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.245210 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:47.251348 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.251328 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:57:47.263239 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263217 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-os-release\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.263239 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.263419 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-etc-kubernetes\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.263419 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-cni-netd\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.263419 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovn-node-metrics-cert\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.263419 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:57:47.263318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-device-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" Apr 22 19:57:47.263419 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-etc-kubernetes\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.263419 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-os-release\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.263419 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-device-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" Apr 22 19:57:47.263419 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-cni-netd\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 
19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-etc-selinux\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-env-overrides\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-etc-selinux\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: 
\"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvwp4\" (UniqueName: \"kubernetes.io/projected/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-kube-api-access-zvwp4\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-etc-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263686 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0763314b-12d3-4771-844c-120f25ae1bc3-kubelet-config\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-system-cni-dir\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-k8s-cni-cncf-io\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263729 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-system-cni-dir\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-etc-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-cni-bin\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.263886 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-daemon-config\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-systemd\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263955 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-ovn\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.263983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-sys-fs\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0763314b-12d3-4771-844c-120f25ae1bc3-kubelet-config\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-cni-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-cni-multus\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-ovn\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-run\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264095 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-var-lib-kubelet\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-socket-dir-parent\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264188 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-node-log\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-cnibin\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.264478 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e24f6b8a-d137-4b5b-94b4-011f680ada1d-cni-binary-copy\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-kubelet\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-host\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264281 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8-konnectivity-ca\") pod \"konnectivity-agent-wzwch\" (UID: \"74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8\") " pod="kube-system/konnectivity-agent-wzwch"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-system-cni-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-cnibin\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rmhd\" (UniqueName: \"kubernetes.io/projected/e24f6b8a-d137-4b5b-94b4-011f680ada1d-kube-api-access-2rmhd\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-systemd-units\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-systemd\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-var-lib-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-serviceca\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f5cde8a-b2da-4205-8a72-53560841ac3b-host-slash\") pod \"iptables-alerter-kpwmj\" (UID: \"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-run-netns\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-tmp\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvqr8\" (UniqueName: \"kubernetes.io/projected/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-kube-api-access-qvqr8\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-multus-certs\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.265215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-cni-bin\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-sys-fs\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-k8s-cni-cncf-io\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264710 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-cni-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-node-log\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264776 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-daemon-config\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-system-cni-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c28089f2-d625-4e69-b372-16c2a540e3a1-cnibin\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-run\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-cnibin\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-systemd-units\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-systemd\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8f5cde8a-b2da-4205-8a72-53560841ac3b-host-slash\") pod \"iptables-alerter-kpwmj\" (UID: \"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-var-lib-kubelet\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-kubelet\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264960 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-socket-dir-parent\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-multus-certs\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266075 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-cni-multus\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.264998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-var-lib-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265082 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bl4\" (UniqueName: \"kubernetes.io/projected/c28089f2-d625-4e69-b372-16c2a540e3a1-kube-api-access-q4bl4\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265106 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-slash\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-hostroot\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-conf-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-socket-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c28089f2-d625-4e69-b372-16c2a540e3a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-kubelet\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8f5cde8a-b2da-4205-8a72-53560841ac3b-iptables-alerter-script\") pod \"iptables-alerter-kpwmj\" (UID: \"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-host\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-cni-bin\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysctl-d\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8-agent-certs\") pod \"konnectivity-agent-wzwch\" (UID: \"74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8\") " pod="kube-system/konnectivity-agent-wzwch"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-hosts-file\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb"
Apr 22 19:57:47.266851 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5fwz\" (UniqueName: \"kubernetes.io/projected/e34993a1-0c8c-4395-9202-5841e22c2788-kube-api-access-n5fwz\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dlq4\" (UniqueName: \"kubernetes.io/projected/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-kube-api-access-8dlq4\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-serviceca\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265420 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-netns\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e24f6b8a-d137-4b5b-94b4-011f680ada1d-cni-binary-copy\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-var-lib-kubelet\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-socket-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-systemd\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-run-openvswitch\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-run-ovn-kubernetes\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265658 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzzfk\" (UniqueName: \"kubernetes.io/projected/36e9b580-270c-4cbb-b3e6-78fde6f244ec-kube-api-access-vzzfk\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265683 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysctl-conf\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-hostroot\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-tuned\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.265712 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-tmp-dir\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb"
Apr 22 19:57:47.267589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.265789 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs podName:d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:47.765766966 +0000 UTC m=+3.080859177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs") pod "network-metrics-daemon-5dv89" (UID: "d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-os-release\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-log-socket\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovnkube-script-lib\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-modprobe-d\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8f5cde8a-b2da-4205-8a72-53560841ac3b-iptables-alerter-script\") pod \"iptables-alerter-kpwmj\" (UID: \"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysconfig\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-kubernetes\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.266008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-sys\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.266033 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-registration-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.266059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0763314b-12d3-4771-844c-120f25ae1bc3-dbus\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.266077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysctl-d\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d"
Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.266083 2577 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovnkube-config\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.266112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-slash\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.266131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8prlz\" (UniqueName: \"kubernetes.io/projected/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-kube-api-access-8prlz\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.268304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.266163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-host\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.265152 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-run-netns\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.267951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-host-cni-bin\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.268133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovnkube-config\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.268190 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.268327 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-env-overrides\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.268632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovn-node-metrics-cert\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.268887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zggk\" (UniqueName: \"kubernetes.io/projected/8f5cde8a-b2da-4205-8a72-53560841ac3b-kube-api-access-5zggk\") pod \"iptables-alerter-kpwmj\" (UID: 
\"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.268918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-host-run-netns\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.268954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-lib-modules\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.269114 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269061 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysctl-conf\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.269586 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.269229 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret podName:0763314b-12d3-4771-844c-120f25ae1bc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:47.769210068 +0000 UTC m=+3.084302291 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret") pod "global-pull-secret-syncer-x55sm" (UID: "0763314b-12d3-4771-844c-120f25ae1bc3") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:47.269586 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-host\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc" Apr 22 19:57:47.269586 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269294 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-kubernetes\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.269586 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-os-release\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.269586 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/36e9b580-270c-4cbb-b3e6-78fde6f244ec-log-socket\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.269586 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269484 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-sys\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.269586 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e24f6b8a-d137-4b5b-94b4-011f680ada1d-multus-conf-dir\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.270038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-lib-modules\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.270038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e34993a1-0c8c-4395-9202-5841e22c2788-registration-dir\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" Apr 22 19:57:47.270038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269734 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-sysconfig\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.270038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-modprobe-d\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.270038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269874 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0763314b-12d3-4771-844c-120f25ae1bc3-dbus\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:47.270038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/36e9b580-270c-4cbb-b3e6-78fde6f244ec-ovnkube-script-lib\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.270038 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.269917 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-tmp\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.271150 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.270465 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:47.271150 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.270509 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:47.271150 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.270523 2577 projected.go:194] Error 
preparing data for projected volume kube-api-access-l9mmc for pod openshift-network-diagnostics/network-check-target-t68sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:47.271150 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.270608 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc podName:73129b41-d555-4a74-9f2a-640a35e9625f nodeName:}" failed. No retries permitted until 2026-04-22 19:57:47.770591513 +0000 UTC m=+3.085683744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l9mmc" (UniqueName: "kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc") pod "network-check-target-t68sf" (UID: "73129b41-d555-4a74-9f2a-640a35e9625f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:47.272026 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.272004 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-etc-tuned\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.273698 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.273671 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvqr8\" (UniqueName: \"kubernetes.io/projected/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-kube-api-access-qvqr8\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:47.280661 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.280595 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rmhd\" (UniqueName: \"kubernetes.io/projected/e24f6b8a-d137-4b5b-94b4-011f680ada1d-kube-api-access-2rmhd\") pod \"multus-pkpcm\" (UID: \"e24f6b8a-d137-4b5b-94b4-011f680ada1d\") " pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.281070 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.281052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bl4\" (UniqueName: \"kubernetes.io/projected/c28089f2-d625-4e69-b372-16c2a540e3a1-kube-api-access-q4bl4\") pod \"multus-additional-cni-plugins-p4l6b\" (UID: \"c28089f2-d625-4e69-b372-16c2a540e3a1\") " pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.281070 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.281058 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zggk\" (UniqueName: \"kubernetes.io/projected/8f5cde8a-b2da-4205-8a72-53560841ac3b-kube-api-access-5zggk\") pod \"iptables-alerter-kpwmj\" (UID: \"8f5cde8a-b2da-4205-8a72-53560841ac3b\") " pod="openshift-network-operator/iptables-alerter-kpwmj" Apr 22 19:57:47.281680 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.281656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5fwz\" (UniqueName: \"kubernetes.io/projected/e34993a1-0c8c-4395-9202-5841e22c2788-kube-api-access-n5fwz\") pod \"aws-ebs-csi-driver-node-tsrrb\" (UID: \"e34993a1-0c8c-4395-9202-5841e22c2788\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" Apr 22 19:57:47.281756 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.281693 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dlq4\" (UniqueName: \"kubernetes.io/projected/8b248b8e-1022-47ab-b16f-e3e4f3ee7abb-kube-api-access-8dlq4\") pod \"node-ca-rpfnc\" (UID: \"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb\") " pod="openshift-image-registry/node-ca-rpfnc" Apr 22 
19:57:47.281998 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.281978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzzfk\" (UniqueName: \"kubernetes.io/projected/36e9b580-270c-4cbb-b3e6-78fde6f244ec-kube-api-access-vzzfk\") pod \"ovnkube-node-wrbxl\" (UID: \"36e9b580-270c-4cbb-b3e6-78fde6f244ec\") " pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.283177 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.283155 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8prlz\" (UniqueName: \"kubernetes.io/projected/90351fbd-9ae1-41d6-bb18-5239e60b2a9d-kube-api-access-8prlz\") pod \"tuned-46d9d\" (UID: \"90351fbd-9ae1-41d6-bb18-5239e60b2a9d\") " pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.370159 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.370122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvwp4\" (UniqueName: \"kubernetes.io/projected/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-kube-api-access-zvwp4\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb" Apr 22 19:57:47.370347 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.370209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8-konnectivity-ca\") pod \"konnectivity-agent-wzwch\" (UID: \"74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8\") " pod="kube-system/konnectivity-agent-wzwch" Apr 22 19:57:47.370347 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.370301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8-agent-certs\") pod \"konnectivity-agent-wzwch\" (UID: \"74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8\") " 
pod="kube-system/konnectivity-agent-wzwch" Apr 22 19:57:47.370347 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.370336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-hosts-file\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb" Apr 22 19:57:47.370504 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.370370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-tmp-dir\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb" Apr 22 19:57:47.370504 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.370455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-hosts-file\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb" Apr 22 19:57:47.370774 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.370713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-tmp-dir\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb" Apr 22 19:57:47.371012 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.370994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8-konnectivity-ca\") pod \"konnectivity-agent-wzwch\" (UID: \"74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8\") " pod="kube-system/konnectivity-agent-wzwch" Apr 22 19:57:47.373067 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:57:47.373045 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8-agent-certs\") pod \"konnectivity-agent-wzwch\" (UID: \"74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8\") " pod="kube-system/konnectivity-agent-wzwch" Apr 22 19:57:47.378415 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.378395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvwp4\" (UniqueName: \"kubernetes.io/projected/3d105cfe-1e71-45ef-b072-4f6de04ca9c1-kube-api-access-zvwp4\") pod \"node-resolver-msxbb\" (UID: \"3d105cfe-1e71-45ef-b072-4f6de04ca9c1\") " pod="openshift-dns/node-resolver-msxbb" Apr 22 19:57:47.453529 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.453452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-46d9d" Apr 22 19:57:47.462246 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.462222 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" Apr 22 19:57:47.469807 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.469789 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" Apr 22 19:57:47.474508 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.474487 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pkpcm" Apr 22 19:57:47.482060 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.482031 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kpwmj" Apr 22 19:57:47.488579 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.488563 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rpfnc" Apr 22 19:57:47.496263 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.496243 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" Apr 22 19:57:47.504786 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.504768 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-msxbb" Apr 22 19:57:47.509358 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.509341 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wzwch" Apr 22 19:57:47.773544 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.773522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:57:47.773626 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.773557 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:47.773626 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:47.773589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:47.773710 ip-10-0-143-253 
kubenswrapper[2577]: E0422 19:57:47.773673 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:47.773710 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.773676 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:47.773710 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.773687 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:47.773710 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.773706 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:47.773851 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.773728 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs podName:d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:48.773713332 +0000 UTC m=+4.088805542 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs") pod "network-metrics-daemon-5dv89" (UID: "d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:47.773851 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.773717 2577 projected.go:194] Error preparing data for projected volume kube-api-access-l9mmc for pod openshift-network-diagnostics/network-check-target-t68sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:47.773851 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.773746 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret podName:0763314b-12d3-4771-844c-120f25ae1bc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:48.773737037 +0000 UTC m=+4.088829249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret") pod "global-pull-secret-syncer-x55sm" (UID: "0763314b-12d3-4771-844c-120f25ae1bc3") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:47.773851 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:47.773770 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc podName:73129b41-d555-4a74-9f2a-640a35e9625f nodeName:}" failed. No retries permitted until 2026-04-22 19:57:48.773755856 +0000 UTC m=+4.088848071 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l9mmc" (UniqueName: "kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc") pod "network-check-target-t68sf" (UID: "73129b41-d555-4a74-9f2a-640a35e9625f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:47.784403 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.784372 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode34993a1_0c8c_4395_9202_5841e22c2788.slice/crio-a366da4db093918c77de43766842ad3d9ee9d496b3290ad079b9226cd831e746 WatchSource:0}: Error finding container a366da4db093918c77de43766842ad3d9ee9d496b3290ad079b9226cd831e746: Status 404 returned error can't find the container with id a366da4db093918c77de43766842ad3d9ee9d496b3290ad079b9226cd831e746 Apr 22 19:57:47.786447 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.786419 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24f6b8a_d137_4b5b_94b4_011f680ada1d.slice/crio-7114da8bb117017cf551a2108c93315bb6dfa401eaff6c7e9cfc0e105b0910a3 WatchSource:0}: Error finding container 7114da8bb117017cf551a2108c93315bb6dfa401eaff6c7e9cfc0e105b0910a3: Status 404 returned error can't find the container with id 7114da8bb117017cf551a2108c93315bb6dfa401eaff6c7e9cfc0e105b0910a3 Apr 22 19:57:47.791508 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.791485 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74ab80fc_aaa7_48f7_8670_ed1cd47ff5c8.slice/crio-3c5862a7efe1d34fc4a6114a382297457bfba44e3ebcbb2d8e4b74e41003a3c5 WatchSource:0}: Error finding container 3c5862a7efe1d34fc4a6114a382297457bfba44e3ebcbb2d8e4b74e41003a3c5: Status 404 returned error can't find the 
container with id 3c5862a7efe1d34fc4a6114a382297457bfba44e3ebcbb2d8e4b74e41003a3c5 Apr 22 19:57:47.792358 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.792326 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc28089f2_d625_4e69_b372_16c2a540e3a1.slice/crio-3e2d798f5f9232ee31d749251b338113aa4b31665bd9280ad722f70253498358 WatchSource:0}: Error finding container 3e2d798f5f9232ee31d749251b338113aa4b31665bd9280ad722f70253498358: Status 404 returned error can't find the container with id 3e2d798f5f9232ee31d749251b338113aa4b31665bd9280ad722f70253498358 Apr 22 19:57:47.792817 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.792794 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b248b8e_1022_47ab_b16f_e3e4f3ee7abb.slice/crio-9e623965fd7dfe8b682ba2c2387a55a6279060dbf232b6c389874d9c11799316 WatchSource:0}: Error finding container 9e623965fd7dfe8b682ba2c2387a55a6279060dbf232b6c389874d9c11799316: Status 404 returned error can't find the container with id 9e623965fd7dfe8b682ba2c2387a55a6279060dbf232b6c389874d9c11799316 Apr 22 19:57:47.793846 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.793707 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90351fbd_9ae1_41d6_bb18_5239e60b2a9d.slice/crio-186b0e3ba1ea5533d43338cd4d99e61f0814160a4b17c1b912b1b74f012e73db WatchSource:0}: Error finding container 186b0e3ba1ea5533d43338cd4d99e61f0814160a4b17c1b912b1b74f012e73db: Status 404 returned error can't find the container with id 186b0e3ba1ea5533d43338cd4d99e61f0814160a4b17c1b912b1b74f012e73db Apr 22 19:57:47.794960 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.794883 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e9b580_270c_4cbb_b3e6_78fde6f244ec.slice/crio-68cb31c8dbc425daf263b2f0564fe5bd61b23b76abbbca14518189021d32307a WatchSource:0}: Error finding container 68cb31c8dbc425daf263b2f0564fe5bd61b23b76abbbca14518189021d32307a: Status 404 returned error can't find the container with id 68cb31c8dbc425daf263b2f0564fe5bd61b23b76abbbca14518189021d32307a Apr 22 19:57:47.795739 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.795706 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f5cde8a_b2da_4205_8a72_53560841ac3b.slice/crio-69dab1ad90be296a756d894b405c7a6a66ff9b6e6dee34b6b1f486d412bc329d WatchSource:0}: Error finding container 69dab1ad90be296a756d894b405c7a6a66ff9b6e6dee34b6b1f486d412bc329d: Status 404 returned error can't find the container with id 69dab1ad90be296a756d894b405c7a6a66ff9b6e6dee34b6b1f486d412bc329d Apr 22 19:57:47.798305 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:57:47.798270 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d105cfe_1e71_45ef_b072_4f6de04ca9c1.slice/crio-b4eac16ccf39105abdc3a1b7d1d3d52a3e569285a4fe5496edf8da52b7a697bd WatchSource:0}: Error finding container b4eac16ccf39105abdc3a1b7d1d3d52a3e569285a4fe5496edf8da52b7a697bd: Status 404 returned error can't find the container with id b4eac16ccf39105abdc3a1b7d1d3d52a3e569285a4fe5496edf8da52b7a697bd Apr 22 19:57:48.200993 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.200657 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:46 +0000 UTC" deadline="2027-10-10 19:18:17.862780221 +0000 UTC" Apr 22 19:57:48.200993 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.200953 2577 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="12863h20m29.661834086s" Apr 22 19:57:48.274640 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.273826 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal" event={"ID":"68c5e58877595fc451d476fd9e217735","Type":"ContainerStarted","Data":"e4917e2a9f1cbcd243ddd872278890f71622d1ec53e61f07d031cfdbee820d58"} Apr 22 19:57:48.279709 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.279626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-46d9d" event={"ID":"90351fbd-9ae1-41d6-bb18-5239e60b2a9d","Type":"ContainerStarted","Data":"186b0e3ba1ea5533d43338cd4d99e61f0814160a4b17c1b912b1b74f012e73db"} Apr 22 19:57:48.283288 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.283219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rpfnc" event={"ID":"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb","Type":"ContainerStarted","Data":"9e623965fd7dfe8b682ba2c2387a55a6279060dbf232b6c389874d9c11799316"} Apr 22 19:57:48.286551 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.286521 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" event={"ID":"e34993a1-0c8c-4395-9202-5841e22c2788","Type":"ContainerStarted","Data":"a366da4db093918c77de43766842ad3d9ee9d496b3290ad079b9226cd831e746"} Apr 22 19:57:48.292445 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.292100 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-253.ec2.internal" podStartSLOduration=2.292083807 podStartE2EDuration="2.292083807s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:48.291197933 +0000 UTC m=+3.606290167" 
watchObservedRunningTime="2026-04-22 19:57:48.292083807 +0000 UTC m=+3.607176039" Apr 22 19:57:48.297250 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.297208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-msxbb" event={"ID":"3d105cfe-1e71-45ef-b072-4f6de04ca9c1","Type":"ContainerStarted","Data":"b4eac16ccf39105abdc3a1b7d1d3d52a3e569285a4fe5496edf8da52b7a697bd"} Apr 22 19:57:48.300401 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.300342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kpwmj" event={"ID":"8f5cde8a-b2da-4205-8a72-53560841ac3b","Type":"ContainerStarted","Data":"69dab1ad90be296a756d894b405c7a6a66ff9b6e6dee34b6b1f486d412bc329d"} Apr 22 19:57:48.306428 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.306401 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"68cb31c8dbc425daf263b2f0564fe5bd61b23b76abbbca14518189021d32307a"} Apr 22 19:57:48.319980 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.319939 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerStarted","Data":"3e2d798f5f9232ee31d749251b338113aa4b31665bd9280ad722f70253498358"} Apr 22 19:57:48.323063 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.322993 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wzwch" event={"ID":"74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8","Type":"ContainerStarted","Data":"3c5862a7efe1d34fc4a6114a382297457bfba44e3ebcbb2d8e4b74e41003a3c5"} Apr 22 19:57:48.327581 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.327520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pkpcm" 
event={"ID":"e24f6b8a-d137-4b5b-94b4-011f680ada1d","Type":"ContainerStarted","Data":"7114da8bb117017cf551a2108c93315bb6dfa401eaff6c7e9cfc0e105b0910a3"} Apr 22 19:57:48.783635 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.783606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:48.783751 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.783674 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:57:48.783751 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:48.783706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:48.783884 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:48.783792 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:48.783884 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:48.783800 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:48.783984 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:48.783896 2577 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:48.783984 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:48.783911 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret podName:0763314b-12d3-4771-844c-120f25ae1bc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:50.783889431 +0000 UTC m=+6.098981657 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret") pod "global-pull-secret-syncer-x55sm" (UID: "0763314b-12d3-4771-844c-120f25ae1bc3") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:48.783984 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:48.783912 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:48.783984 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:48.783926 2577 projected.go:194] Error preparing data for projected volume kube-api-access-l9mmc for pod openshift-network-diagnostics/network-check-target-t68sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:48.783984 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:48.783944 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs podName:d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:50.783931258 +0000 UTC m=+6.099023483 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs") pod "network-metrics-daemon-5dv89" (UID: "d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:48.783984 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:48.783975 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc podName:73129b41-d555-4a74-9f2a-640a35e9625f nodeName:}" failed. No retries permitted until 2026-04-22 19:57:50.783958764 +0000 UTC m=+6.099050988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-l9mmc" (UniqueName: "kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc") pod "network-check-target-t68sf" (UID: "73129b41-d555-4a74-9f2a-640a35e9625f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:49.249723 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:49.249015 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:49.249723 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:49.249153 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8" Apr 22 19:57:49.249723 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:49.249573 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:49.249723 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:49.249668 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3" Apr 22 19:57:49.250762 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:49.250563 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:57:49.250762 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:49.250655 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f" Apr 22 19:57:49.343319 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:49.343186 2577 generic.go:358] "Generic (PLEG): container finished" podID="6a8d971fc4211db7d5e8400be76011e5" containerID="b6f0f31fe3b07e25210a4a36c92a5ea408e66321017dad2e203338e32e1c317f" exitCode=0 Apr 22 19:57:49.343319 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:49.343273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" event={"ID":"6a8d971fc4211db7d5e8400be76011e5","Type":"ContainerDied","Data":"b6f0f31fe3b07e25210a4a36c92a5ea408e66321017dad2e203338e32e1c317f"} Apr 22 19:57:50.351715 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:50.351674 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" event={"ID":"6a8d971fc4211db7d5e8400be76011e5","Type":"ContainerStarted","Data":"5d6410d6e7019a88516362312028afd83e414a2e6dff4a60827833ba9b0d8b7d"} Apr 22 19:57:50.801812 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:50.801765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:50.801858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:57:50.802128 
ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:50.801891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:50.801986 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:50.801987 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:50.801998 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:50.802042 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret podName:0763314b-12d3-4771-844c-120f25ae1bc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:54.802022745 +0000 UTC m=+10.117114978 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret") pod "global-pull-secret-syncer-x55sm" (UID: "0763314b-12d3-4771-844c-120f25ae1bc3") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:50.802054 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:50.802062 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs podName:d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:54.802050322 +0000 UTC m=+10.117142535 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs") pod "network-metrics-daemon-5dv89" (UID: "d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:50.802067 2577 projected.go:194] Error preparing data for projected volume kube-api-access-l9mmc for pod openshift-network-diagnostics/network-check-target-t68sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:50.802128 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:50.802107 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc podName:73129b41-d555-4a74-9f2a-640a35e9625f nodeName:}" failed. 
No retries permitted until 2026-04-22 19:57:54.802091408 +0000 UTC m=+10.117183631 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l9mmc" (UniqueName: "kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc") pod "network-check-target-t68sf" (UID: "73129b41-d555-4a74-9f2a-640a35e9625f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:51.250473 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:51.250393 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:57:51.250610 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:51.250540 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f" Apr 22 19:57:51.250985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:51.250968 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:51.251069 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:51.250972 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:51.251121 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:51.251080 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8" Apr 22 19:57:51.251174 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:51.251155 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3" Apr 22 19:57:53.248199 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:53.248126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:57:53.248651 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:53.248248 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f" Apr 22 19:57:53.248790 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:53.248703 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:53.248902 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:53.248806 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8" Apr 22 19:57:53.248992 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:53.248983 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:53.249133 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:53.249078 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3" Apr 22 19:57:54.837934 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:54.837894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:54.837985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:54.838022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod \"global-pull-secret-syncer-x55sm\" (UID: 
\"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:54.838067 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:54.838136 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:54.838147 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs podName:d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:02.838125834 +0000 UTC m=+18.153218048 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs") pod "network-metrics-daemon-5dv89" (UID: "d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:54.838149 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:54.838170 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:54.838182 2577 projected.go:194] Error preparing data for projected volume kube-api-access-l9mmc for pod openshift-network-diagnostics/network-check-target-t68sf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:54.838186 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret podName:0763314b-12d3-4771-844c-120f25ae1bc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:02.838172148 +0000 UTC m=+18.153264357 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret") pod "global-pull-secret-syncer-x55sm" (UID: "0763314b-12d3-4771-844c-120f25ae1bc3") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:54.838415 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:54.838229 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc podName:73129b41-d555-4a74-9f2a-640a35e9625f nodeName:}" failed. No retries permitted until 2026-04-22 19:58:02.838214839 +0000 UTC m=+18.153307072 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-l9mmc" (UniqueName: "kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc") pod "network-check-target-t68sf" (UID: "73129b41-d555-4a74-9f2a-640a35e9625f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:55.249026 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:55.248942 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:57:55.249198 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:55.249087 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:55.249198 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:55.249116 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:57:55.249198 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:55.249137 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:57:55.249362 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:55.249221 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:57:55.249362 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:55.249298 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:57:57.248442 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:57.248360 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:57:57.248883 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:57.248452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:57:57.248883 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:57.248470 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:57.248883 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:57.248560 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:57:57.248883 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:57.248678 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:57:57.248883 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:57.248763 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:57:59.247801 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:59.247770 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:57:59.248254 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:59.247771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:57:59.248254 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:57:59.247777 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:57:59.248254 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:59.247922 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:57:59.248254 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:59.248060 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:57:59.248254 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:57:59.248146 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:58:01.248163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:01.248127 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:58:01.248601 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:01.248126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:01.248601 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:01.248265 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:58:01.248601 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:01.248127 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:58:01.248601 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:01.248395 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:58:01.248601 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:01.248525 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:58:02.899079 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:02.899042 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:02.899094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:02.899134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:02.899239 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:02.899250 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:02.899239 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:02.899306 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs podName:d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.899290434 +0000 UTC m=+34.214382657 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs") pod "network-metrics-daemon-5dv89" (UID: "d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:02.899318 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:02.899320 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret podName:0763314b-12d3-4771-844c-120f25ae1bc3 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.899313362 +0000 UTC m=+34.214405572 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret") pod "global-pull-secret-syncer-x55sm" (UID: "0763314b-12d3-4771-844c-120f25ae1bc3") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:02.899331 2577 projected.go:194] Error preparing data for projected volume kube-api-access-l9mmc for pod openshift-network-diagnostics/network-check-target-t68sf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:02.899656 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:02.899370 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc podName:73129b41-d555-4a74-9f2a-640a35e9625f nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.899355196 +0000 UTC m=+34.214447412 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-l9mmc" (UniqueName: "kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc") pod "network-check-target-t68sf" (UID: "73129b41-d555-4a74-9f2a-640a35e9625f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:03.247995 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:03.247909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:58:03.248161 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:03.247909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:03.248161 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:03.248043 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:58:03.248161 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:03.247909 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:58:03.248161 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:03.248128 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:58:03.248354 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:03.248227 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:58:05.248989 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.248700 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:58:05.249331 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:05.249039 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:58:05.249331 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.249153 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:05.249331 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:05.249236 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:58:05.250304 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.250283 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:58:05.250437 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:05.250393 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:58:05.381414 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.381191 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-46d9d" event={"ID":"90351fbd-9ae1-41d6-bb18-5239e60b2a9d","Type":"ContainerStarted","Data":"cbadf15c75e14157a3447c96b0cbdb2bbcf7adfcaa3cea91ee46826a0e559c72"}
Apr 22 19:58:05.382731 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.382676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rpfnc" event={"ID":"8b248b8e-1022-47ab-b16f-e3e4f3ee7abb","Type":"ContainerStarted","Data":"79db902453cd09cb248986b05f3c76e1d5a1b0badc25ac46a821b253d7dad437"}
Apr 22 19:58:05.384023 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.383990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" event={"ID":"e34993a1-0c8c-4395-9202-5841e22c2788","Type":"ContainerStarted","Data":"7ccda2ac647a8703a98c44d08bd69301d509b3da2a289d03d5328d5f9420355b"}
Apr 22 19:58:05.385319 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.385194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-msxbb" event={"ID":"3d105cfe-1e71-45ef-b072-4f6de04ca9c1","Type":"ContainerStarted","Data":"6e2d66e1a6e24a1603415dab238a41a47bd3bd025c9d95a88834b928eab4eec9"}
Apr 22 19:58:05.390421 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.390398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"0ee9ad984da675bb78d99e49ff207e82259d65743d051d5414b056820e370279"}
Apr 22 19:58:05.390516 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.390427 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"996bf9ec5421f81ba862cba0ea93ed39116dda0bf82a0cd4aa5835983a8d1809"}
Apr 22 19:58:05.392083 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.392061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerStarted","Data":"229719abca5e51abac535e2710d29a115c3ac957aee8863303fce4a39b5580f1"}
Apr 22 19:58:05.393607 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.393516 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wzwch" event={"ID":"74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8","Type":"ContainerStarted","Data":"38bd8e61db67c9b83a528939f65169efc07df4758d75326f0f35f2f96196f023"}
Apr 22 19:58:05.394783 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.394760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pkpcm" event={"ID":"e24f6b8a-d137-4b5b-94b4-011f680ada1d","Type":"ContainerStarted","Data":"814914a3096438569897aa4c5ec12a7da69322cfcb6c0a8fc334ff3328758a50"}
Apr 22 19:58:05.400127 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.399980 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-253.ec2.internal" podStartSLOduration=19.399964878 podStartE2EDuration="19.399964878s" podCreationTimestamp="2026-04-22 19:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:50.36666727 +0000 UTC m=+5.681759503" watchObservedRunningTime="2026-04-22 19:58:05.399964878 +0000 UTC m=+20.715057114"
Apr 22 19:58:05.420949 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.420885 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-46d9d" podStartSLOduration=3.300269929 podStartE2EDuration="20.420857549s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.796062828 +0000 UTC m=+3.111155044" lastFinishedPulling="2026-04-22 19:58:04.916650443 +0000 UTC m=+20.231742664" observedRunningTime="2026-04-22 19:58:05.399958343 +0000 UTC m=+20.715050575" watchObservedRunningTime="2026-04-22 19:58:05.420857549 +0000 UTC m=+20.735949782"
Apr 22 19:58:05.436767 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.436718 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pkpcm" podStartSLOduration=2.946594572 podStartE2EDuration="20.436703394s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.789386476 +0000 UTC m=+3.104478698" lastFinishedPulling="2026-04-22 19:58:05.279495307 +0000 UTC m=+20.594587520" observedRunningTime="2026-04-22 19:58:05.435982252 +0000 UTC m=+20.751074495" watchObservedRunningTime="2026-04-22 19:58:05.436703394 +0000 UTC m=+20.751795626"
Apr 22 19:58:05.451283 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.451243 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-msxbb" podStartSLOduration=3.334232095 podStartE2EDuration="20.451232717s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.799652665 +0000 UTC m=+3.114744879" lastFinishedPulling="2026-04-22 19:58:04.916653292 +0000 UTC m=+20.231745501" observedRunningTime="2026-04-22 19:58:05.45098779 +0000 UTC m=+20.766080023" watchObservedRunningTime="2026-04-22 19:58:05.451232717 +0000 UTC m=+20.766324948"
Apr 22 19:58:05.484547 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.484501 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rpfnc" podStartSLOduration=3.363081327 podStartE2EDuration="20.484487403s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.795247694 +0000 UTC m=+3.110339918" lastFinishedPulling="2026-04-22 19:58:04.916653784 +0000 UTC m=+20.231745994" observedRunningTime="2026-04-22 19:58:05.46419054 +0000 UTC m=+20.779282771" watchObservedRunningTime="2026-04-22 19:58:05.484487403 +0000 UTC m=+20.799579632"
Apr 22 19:58:05.485049 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:05.485026 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wzwch" podStartSLOduration=3.361908395 podStartE2EDuration="20.485019843s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.79354878 +0000 UTC m=+3.108640993" lastFinishedPulling="2026-04-22 19:58:04.916660232 +0000 UTC m=+20.231752441" observedRunningTime="2026-04-22 19:58:05.484458621 +0000 UTC m=+20.799550857" watchObservedRunningTime="2026-04-22 19:58:05.485019843 +0000 UTC m=+20.800112115"
Apr 22 19:58:06.111941 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.111921 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:58:06.228133 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.228037 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:58:06.111937532Z","UUID":"3d764dde-1ee5-4552-97f1-00f84a003552","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:58:06.229905 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.229883 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:58:06.229905 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.229910 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:58:06.397885 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.397830 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kpwmj" event={"ID":"8f5cde8a-b2da-4205-8a72-53560841ac3b","Type":"ContainerStarted","Data":"0563a308168d991fb5ec1fd99dc7817c010acce464a14ecdeec58a58436850ab"}
Apr 22 19:58:06.400433 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.400407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"84042cabbe5303e894ddfdcd247a1f22c3057cd21a6fb687706c242d081294c7"}
Apr 22 19:58:06.400556 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.400443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"9f1de609890f65acbfacda55adb8ab233960e7ffeab64d07e815bd53a56d3593"}
Apr 22 19:58:06.400556 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.400458 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"910613e56777bcefd28e6bb40ba16349d7958b1759758b032b6515a61531d6eb"}
Apr 22 19:58:06.400556 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.400468 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"b700b4b3925ca55bde5d92d9e15d3a9e06fe68c1a0b28d3f9c101a9e6426acf3"}
Apr 22 19:58:06.401710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.401684 2577 generic.go:358] "Generic (PLEG): container finished" podID="c28089f2-d625-4e69-b372-16c2a540e3a1" containerID="229719abca5e51abac535e2710d29a115c3ac957aee8863303fce4a39b5580f1" exitCode=0
Apr 22 19:58:06.401796 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.401750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerDied","Data":"229719abca5e51abac535e2710d29a115c3ac957aee8863303fce4a39b5580f1"}
Apr 22 19:58:06.403206 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.403183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" event={"ID":"e34993a1-0c8c-4395-9202-5841e22c2788","Type":"ContainerStarted","Data":"4d5b0c6ad537394c03febd62c975fde2f6b6f8868846a0976a131fbb09ab9ab6"}
Apr 22 19:58:06.431502 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:06.431460 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kpwmj" podStartSLOduration=4.264564281 podStartE2EDuration="21.431447449s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.797740663 +0000 UTC m=+3.112832878" lastFinishedPulling="2026-04-22 19:58:04.964623818 +0000 UTC m=+20.279716046" observedRunningTime="2026-04-22 19:58:06.41260702 +0000 UTC m=+21.727699251" watchObservedRunningTime="2026-04-22 19:58:06.431447449 +0000 UTC m=+21.746539681"
Apr 22 19:58:07.248651 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:07.248570 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:58:07.248880 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:07.248570 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:07.248880 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:07.248700 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:58:07.248880 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:07.248783 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:58:07.248880 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:07.248570 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:58:07.249093 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:07.248898 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:58:07.407011 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:07.406975 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" event={"ID":"e34993a1-0c8c-4395-9202-5841e22c2788","Type":"ContainerStarted","Data":"1a324d98686a5b2ce247499d939ab782b92b1028bd669dfd2ef4a6b71984b435"}
Apr 22 19:58:07.424281 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:07.424230 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tsrrb" podStartSLOduration=3.228878626 podStartE2EDuration="22.424211205s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.787823956 +0000 UTC m=+3.102916169" lastFinishedPulling="2026-04-22 19:58:06.983156526 +0000 UTC m=+22.298248748" observedRunningTime="2026-04-22 19:58:07.423420794 +0000 UTC m=+22.738513026" watchObservedRunningTime="2026-04-22 19:58:07.424211205 +0000 UTC m=+22.739303439"
Apr 22 19:58:08.414688 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:08.414654 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"a8b7636767c5f705dbaed21e04f86b434f4c6aadbdc6063b33aed0f1b33b8f58"}
Apr 22 19:58:08.742894 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:08.742800 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wzwch"
Apr 22 19:58:08.743597 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:08.743577 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wzwch"
Apr 22 19:58:09.248772 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:09.248531 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:58:09.248977 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:09.248591 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:09.248977 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:09.248895 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:58:09.248977 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:09.248592 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:58:09.249139 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:09.248983 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:58:09.249139 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:09.249040 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:58:09.416102 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:09.416068 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wzwch"
Apr 22 19:58:09.416587 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:09.416571 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wzwch"
Apr 22 19:58:10.421364 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:10.421336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" event={"ID":"36e9b580-270c-4cbb-b3e6-78fde6f244ec","Type":"ContainerStarted","Data":"77bd49b7327b51ab128ae614442815edb4d82c32be2ca2b0a76ddd9f306a4ec6"}
Apr 22 19:58:10.422111 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:10.421605 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:58:10.423086 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:10.423060 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerStarted","Data":"61b503be482ba8a9e51e4a5778bf38f18c54f13e32b0d1c099811fb4b61c1dd9"}
Apr 22 19:58:10.436541 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:10.436411 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:58:10.448455 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:10.448410 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl" podStartSLOduration=8.244572155 podStartE2EDuration="25.448395666s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.79657619 +0000 UTC m=+3.111668409" lastFinishedPulling="2026-04-22 19:58:05.00039969 +0000 UTC m=+20.315491920" observedRunningTime="2026-04-22 19:58:10.448137069 +0000 UTC m=+25.763229301" watchObservedRunningTime="2026-04-22 19:58:10.448395666 +0000 UTC m=+25.763487899"
Apr 22 19:58:10.811589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:10.811559 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:58:11.247816 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:11.247746 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:11.247985 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:11.247746 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:58:11.247985 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:11.247870 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8"
Apr 22 19:58:11.247985 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:11.247968 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f"
Apr 22 19:58:11.248140 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:11.247748 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm"
Apr 22 19:58:11.248140 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:11.248074 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3"
Apr 22 19:58:11.432186 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:11.432146 2577 generic.go:358] "Generic (PLEG): container finished" podID="c28089f2-d625-4e69-b372-16c2a540e3a1" containerID="61b503be482ba8a9e51e4a5778bf38f18c54f13e32b0d1c099811fb4b61c1dd9" exitCode=0
Apr 22 19:58:11.432616 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:11.432267 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerDied","Data":"61b503be482ba8a9e51e4a5778bf38f18c54f13e32b0d1c099811fb4b61c1dd9"}
Apr 22 19:58:11.432892 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:11.432862 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:58:11.449936 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:11.449914 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:58:12.095162 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:12.095027 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t68sf"]
Apr 22 19:58:12.095162 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:12.095136 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:58:12.095443 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:12.095215 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f" Apr 22 19:58:12.098116 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:12.098012 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5dv89"] Apr 22 19:58:12.098116 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:12.098117 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:58:12.098262 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:12.098222 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8" Apr 22 19:58:12.106528 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:12.106435 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x55sm"] Apr 22 19:58:12.106528 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:12.106525 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:58:12.106675 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:12.106611 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3" Apr 22 19:58:12.435662 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:12.435618 2577 generic.go:358] "Generic (PLEG): container finished" podID="c28089f2-d625-4e69-b372-16c2a540e3a1" containerID="cc73116bac94bedd2a62c8b9a8835187171a5b61bb5deb1dd6a19998194032dc" exitCode=0 Apr 22 19:58:12.436070 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:12.435703 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerDied","Data":"cc73116bac94bedd2a62c8b9a8835187171a5b61bb5deb1dd6a19998194032dc"} Apr 22 19:58:13.439322 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:13.439284 2577 generic.go:358] "Generic (PLEG): container finished" podID="c28089f2-d625-4e69-b372-16c2a540e3a1" containerID="701355cbbf47f732f8f96dbd018d29d0c7f2994e09d78ed3ec834750fee3ec5e" exitCode=0 Apr 22 19:58:13.439763 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:13.439367 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerDied","Data":"701355cbbf47f732f8f96dbd018d29d0c7f2994e09d78ed3ec834750fee3ec5e"} Apr 22 19:58:14.248756 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:14.248671 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:58:14.248950 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:14.248795 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f" Apr 22 19:58:14.248950 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:14.248804 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:58:14.248950 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:14.248830 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:58:14.248950 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:14.248911 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3" Apr 22 19:58:14.249166 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:14.248991 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8" Apr 22 19:58:16.248827 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:16.248572 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:58:16.249236 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:16.248638 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:58:16.249236 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:16.248662 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:58:16.249236 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:16.248952 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x55sm" podUID="0763314b-12d3-4771-844c-120f25ae1bc3" Apr 22 19:58:16.249236 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:16.249028 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5dv89" podUID="d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8" Apr 22 19:58:16.249236 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:16.249097 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-t68sf" podUID="73129b41-d555-4a74-9f2a-640a35e9625f" Apr 22 19:58:18.027372 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.027305 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-253.ec2.internal" event="NodeReady" Apr 22 19:58:18.027895 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.027448 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:58:18.060779 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.060752 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69c5956889-gpgjs"] Apr 22 19:58:18.089896 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.089864 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ptknf"] Apr 22 19:58:18.090068 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.090004 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.095342 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.095301 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:58:18.095505 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.095323 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:58:18.096270 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.096246 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:58:18.096769 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.096571 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wxhns\"" Apr 22 19:58:18.101853 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.101817 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:58:18.110998 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.110973 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-svd5v"] Apr 22 19:58:18.111141 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.111124 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.113658 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.113637 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:58:18.113742 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.113693 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:58:18.113873 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.113855 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wrr4x\"" Apr 22 19:58:18.125292 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.125271 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69c5956889-gpgjs"] Apr 22 19:58:18.125292 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.125295 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ptknf"] Apr 22 19:58:18.125408 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.125304 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-svd5v"] Apr 22 19:58:18.125408 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.125399 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:18.127774 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.127712 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.127998 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.127981 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:58:18.128093 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.128014 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.128244 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.128161 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2xv8f\"" Apr 22 19:58:18.218894 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.218853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f493ef8a-2452-4069-a58f-62b9adde6d11-ca-trust-extracted\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.218894 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.218905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82d3388d-34c5-45f5-82dd-28252d41e89a-config-volume\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.219214 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.218937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-installation-pull-secrets\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.219214 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.218966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-bound-sa-token\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.219214 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.219024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqpvt\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-kube-api-access-mqpvt\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.219214 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.219106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.219214 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.219135 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-image-registry-private-configuration\") pod \"image-registry-69c5956889-gpgjs\" (UID: 
\"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.219214 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.219154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-certificates\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.219214 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.219184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/82d3388d-34c5-45f5-82dd-28252d41e89a-tmp-dir\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.219214 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.219212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-949ff\" (UniqueName: \"kubernetes.io/projected/82d3388d-34c5-45f5-82dd-28252d41e89a-kube-api-access-949ff\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.219602 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.219242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-trusted-ca\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.219602 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.219269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.248364 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.248325 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:58:18.248364 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.248361 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:58:18.248566 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.248345 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:58:18.251254 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.251080 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:58:18.251254 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.251085 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:58:18.251254 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.251116 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:58:18.251254 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.251191 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gl2tf\"" Apr 22 19:58:18.251588 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.251260 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pxnlc\"" Apr 22 
19:58:18.251588 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.251300 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:58:18.319662 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f493ef8a-2452-4069-a58f-62b9adde6d11-ca-trust-extracted\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.319870 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82d3388d-34c5-45f5-82dd-28252d41e89a-config-volume\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.319870 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-installation-pull-secrets\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.319870 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-bound-sa-token\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.319870 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319768 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vds\" (UniqueName: \"kubernetes.io/projected/a663c8f0-aa0a-4c22-a907-7ecf606a4790-kube-api-access-z7vds\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:18.319870 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqpvt\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-kube-api-access-mqpvt\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-image-registry-private-configuration\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-certificates\") pod \"image-registry-69c5956889-gpgjs\" (UID: 
\"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/82d3388d-34c5-45f5-82dd-28252d41e89a-tmp-dir\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.319992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-949ff\" (UniqueName: \"kubernetes.io/projected/82d3388d-34c5-45f5-82dd-28252d41e89a-kube-api-access-949ff\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.320024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-trusted-ca\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.320051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.320087 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f493ef8a-2452-4069-a58f-62b9adde6d11-ca-trust-extracted\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.320137 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.320096 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:18.320559 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.320346 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82d3388d-34c5-45f5-82dd-28252d41e89a-config-volume\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.320710 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.320677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-certificates\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.320780 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.320757 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:18.320780 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.320778 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c5956889-gpgjs: secret "image-registry-tls" not found Apr 22 19:58:18.320892 ip-10-0-143-253 kubenswrapper[2577]: E0422 
19:58:18.320872 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls podName:f493ef8a-2452-4069-a58f-62b9adde6d11 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.820817927 +0000 UTC m=+34.135910156 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls") pod "image-registry-69c5956889-gpgjs" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11") : secret "image-registry-tls" not found Apr 22 19:58:18.321117 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.321050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/82d3388d-34c5-45f5-82dd-28252d41e89a-tmp-dir\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.321117 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.321091 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:18.321276 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.321155 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls podName:82d3388d-34c5-45f5-82dd-28252d41e89a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.821136512 +0000 UTC m=+34.136228727 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls") pod "dns-default-ptknf" (UID: "82d3388d-34c5-45f5-82dd-28252d41e89a") : secret "dns-default-metrics-tls" not found Apr 22 19:58:18.321951 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.321932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-trusted-ca\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.325326 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.325306 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-image-registry-private-configuration\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.325419 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.325330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-installation-pull-secrets\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.334436 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.334414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-bound-sa-token\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 
22 19:58:18.338596 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.338576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqpvt\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-kube-api-access-mqpvt\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.345138 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.345118 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-949ff\" (UniqueName: \"kubernetes.io/projected/82d3388d-34c5-45f5-82dd-28252d41e89a-kube-api-access-949ff\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.421438 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.421407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:18.421620 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.421468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7vds\" (UniqueName: \"kubernetes.io/projected/a663c8f0-aa0a-4c22-a907-7ecf606a4790-kube-api-access-z7vds\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:18.421676 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.421613 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:18.421718 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.421692 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert podName:a663c8f0-aa0a-4c22-a907-7ecf606a4790 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.921671667 +0000 UTC m=+34.236763891 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert") pod "ingress-canary-svd5v" (UID: "a663c8f0-aa0a-4c22-a907-7ecf606a4790") : secret "canary-serving-cert" not found Apr 22 19:58:18.435483 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.435450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7vds\" (UniqueName: \"kubernetes.io/projected/a663c8f0-aa0a-4c22-a907-7ecf606a4790-kube-api-access-z7vds\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:18.825913 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.825863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:18.826163 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.825994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:18.826163 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.826034 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:18.826163 ip-10-0-143-253 kubenswrapper[2577]: E0422 
19:58:18.826058 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c5956889-gpgjs: secret "image-registry-tls" not found Apr 22 19:58:18.826163 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.826118 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:18.826163 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.826123 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls podName:f493ef8a-2452-4069-a58f-62b9adde6d11 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.826103322 +0000 UTC m=+35.141195538 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls") pod "image-registry-69c5956889-gpgjs" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11") : secret "image-registry-tls" not found Apr 22 19:58:18.826433 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.826184 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls podName:82d3388d-34c5-45f5-82dd-28252d41e89a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.826165977 +0000 UTC m=+35.141258189 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls") pod "dns-default-ptknf" (UID: "82d3388d-34c5-45f5-82dd-28252d41e89a") : secret "dns-default-metrics-tls" not found Apr 22 19:58:18.927241 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.927190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89" Apr 22 19:58:18.927241 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.927246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:18.927488 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.927338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:58:18.927488 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.927367 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:58:18.927488 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.927372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod 
\"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:58:18.927488 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.927373 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:18.927488 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.927447 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs podName:d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.927425228 +0000 UTC m=+66.242517461 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs") pod "network-metrics-daemon-5dv89" (UID: "d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8") : secret "metrics-daemon-secret" not found Apr 22 19:58:18.927750 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:18.927495 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert podName:a663c8f0-aa0a-4c22-a907-7ecf606a4790 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:19.927472547 +0000 UTC m=+35.242564770 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert") pod "ingress-canary-svd5v" (UID: "a663c8f0-aa0a-4c22-a907-7ecf606a4790") : secret "canary-serving-cert" not found Apr 22 19:58:18.930065 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.930040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0763314b-12d3-4771-844c-120f25ae1bc3-original-pull-secret\") pod \"global-pull-secret-syncer-x55sm\" (UID: \"0763314b-12d3-4771-844c-120f25ae1bc3\") " pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:58:18.930279 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:18.930254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9mmc\" (UniqueName: \"kubernetes.io/projected/73129b41-d555-4a74-9f2a-640a35e9625f-kube-api-access-l9mmc\") pod \"network-check-target-t68sf\" (UID: \"73129b41-d555-4a74-9f2a-640a35e9625f\") " pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:58:19.159632 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:19.159554 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x55sm" Apr 22 19:58:19.170361 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:19.170341 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:58:19.488967 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:19.488784 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-t68sf"] Apr 22 19:58:19.493889 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:58:19.493857 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73129b41_d555_4a74_9f2a_640a35e9625f.slice/crio-9dfc49ebf620c4edf52222dfb2fab46374485f3d15a5b72836ff8feb13d709d0 WatchSource:0}: Error finding container 9dfc49ebf620c4edf52222dfb2fab46374485f3d15a5b72836ff8feb13d709d0: Status 404 returned error can't find the container with id 9dfc49ebf620c4edf52222dfb2fab46374485f3d15a5b72836ff8feb13d709d0 Apr 22 19:58:19.498340 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:19.498168 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x55sm"] Apr 22 19:58:19.506763 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:58:19.506741 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0763314b_12d3_4771_844c_120f25ae1bc3.slice/crio-74cd94ccd3a2e829592acc193e5f0b08fad898585de94dac1016876e86729c82 WatchSource:0}: Error finding container 74cd94ccd3a2e829592acc193e5f0b08fad898585de94dac1016876e86729c82: Status 404 returned error can't find the container with id 74cd94ccd3a2e829592acc193e5f0b08fad898585de94dac1016876e86729c82 Apr 22 19:58:19.835204 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:19.835173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:19.835204 
ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:19.835218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:19.835511 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:19.835316 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:19.835511 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:19.835329 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c5956889-gpgjs: secret "image-registry-tls" not found Apr 22 19:58:19.835511 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:19.835333 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:19.835511 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:19.835379 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls podName:f493ef8a-2452-4069-a58f-62b9adde6d11 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:21.835364391 +0000 UTC m=+37.150456600 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls") pod "image-registry-69c5956889-gpgjs" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11") : secret "image-registry-tls" not found Apr 22 19:58:19.835511 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:19.835392 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls podName:82d3388d-34c5-45f5-82dd-28252d41e89a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:21.835385761 +0000 UTC m=+37.150477971 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls") pod "dns-default-ptknf" (UID: "82d3388d-34c5-45f5-82dd-28252d41e89a") : secret "dns-default-metrics-tls" not found Apr 22 19:58:19.936526 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:19.936452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:19.936663 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:19.936572 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:19.936663 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:19.936627 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert podName:a663c8f0-aa0a-4c22-a907-7ecf606a4790 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:21.936611863 +0000 UTC m=+37.251704094 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert") pod "ingress-canary-svd5v" (UID: "a663c8f0-aa0a-4c22-a907-7ecf606a4790") : secret "canary-serving-cert" not found Apr 22 19:58:20.457966 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:20.457928 2577 generic.go:358] "Generic (PLEG): container finished" podID="c28089f2-d625-4e69-b372-16c2a540e3a1" containerID="d0ec8c8419e7026ffa46a6cfe0eadeb9f0ecf9cf41f156cec4e02cd412581ca4" exitCode=0 Apr 22 19:58:20.458818 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:20.458015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerDied","Data":"d0ec8c8419e7026ffa46a6cfe0eadeb9f0ecf9cf41f156cec4e02cd412581ca4"} Apr 22 19:58:20.459475 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:20.459077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x55sm" event={"ID":"0763314b-12d3-4771-844c-120f25ae1bc3","Type":"ContainerStarted","Data":"74cd94ccd3a2e829592acc193e5f0b08fad898585de94dac1016876e86729c82"} Apr 22 19:58:20.460270 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:20.460207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t68sf" event={"ID":"73129b41-d555-4a74-9f2a-640a35e9625f","Type":"ContainerStarted","Data":"9dfc49ebf620c4edf52222dfb2fab46374485f3d15a5b72836ff8feb13d709d0"} Apr 22 19:58:21.466120 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:21.466085 2577 generic.go:358] "Generic (PLEG): container finished" podID="c28089f2-d625-4e69-b372-16c2a540e3a1" containerID="a9cb6954c820135a8fd464dcc03ec6984414a0cf8d1554373ec61f3540859eef" exitCode=0 Apr 22 19:58:21.466867 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:21.466165 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerDied","Data":"a9cb6954c820135a8fd464dcc03ec6984414a0cf8d1554373ec61f3540859eef"} Apr 22 19:58:21.852786 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:21.852755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:21.853003 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:21.852811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:21.853003 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:21.852954 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:21.853003 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:21.852970 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c5956889-gpgjs: secret "image-registry-tls" not found Apr 22 19:58:21.853003 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:21.852978 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:21.853216 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:21.853029 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls podName:f493ef8a-2452-4069-a58f-62b9adde6d11 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:25.853011082 +0000 UTC m=+41.168103315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls") pod "image-registry-69c5956889-gpgjs" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11") : secret "image-registry-tls" not found Apr 22 19:58:21.853216 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:21.853042 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls podName:82d3388d-34c5-45f5-82dd-28252d41e89a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:25.853036635 +0000 UTC m=+41.168128845 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls") pod "dns-default-ptknf" (UID: "82d3388d-34c5-45f5-82dd-28252d41e89a") : secret "dns-default-metrics-tls" not found Apr 22 19:58:21.953601 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:21.953556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v" Apr 22 19:58:21.953772 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:21.953710 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:21.953881 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:21.953865 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert podName:a663c8f0-aa0a-4c22-a907-7ecf606a4790 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:25.953765574 +0000 UTC m=+41.268857789 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert") pod "ingress-canary-svd5v" (UID: "a663c8f0-aa0a-4c22-a907-7ecf606a4790") : secret "canary-serving-cert" not found Apr 22 19:58:22.473087 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:22.473051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" event={"ID":"c28089f2-d625-4e69-b372-16c2a540e3a1","Type":"ContainerStarted","Data":"ec5c7ce99ad54ab33904b3fc789bfd5170ae24414baf341b31a1296c97e6a8e5"} Apr 22 19:58:22.495376 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:22.495315 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p4l6b" podStartSLOduration=5.936176048 podStartE2EDuration="37.495296497s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:57:47.794116663 +0000 UTC m=+3.109208873" lastFinishedPulling="2026-04-22 19:58:19.353237095 +0000 UTC m=+34.668329322" observedRunningTime="2026-04-22 19:58:22.493929019 +0000 UTC m=+37.809021254" watchObservedRunningTime="2026-04-22 19:58:22.495296497 +0000 UTC m=+37.810388736" Apr 22 19:58:25.481228 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:25.480983 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-t68sf" event={"ID":"73129b41-d555-4a74-9f2a-640a35e9625f","Type":"ContainerStarted","Data":"0708c97a9893b6ab6ce99c5d3c627c808be4122c8e998fcaf0c38fecce1181d6"} Apr 22 19:58:25.481636 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:25.481250 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-t68sf" Apr 22 19:58:25.482335 ip-10-0-143-253 kubenswrapper[2577]: I0422 
19:58:25.482309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x55sm" event={"ID":"0763314b-12d3-4771-844c-120f25ae1bc3","Type":"ContainerStarted","Data":"c412cae32f67bce221a3eff9bbeac30df442ac4279764e60e0b8034737dd56e8"} Apr 22 19:58:25.494904 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:25.494860 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-t68sf" podStartSLOduration=35.574461346 podStartE2EDuration="40.494828735s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:58:19.495865456 +0000 UTC m=+34.810957670" lastFinishedPulling="2026-04-22 19:58:24.416232846 +0000 UTC m=+39.731325059" observedRunningTime="2026-04-22 19:58:25.493928837 +0000 UTC m=+40.809021068" watchObservedRunningTime="2026-04-22 19:58:25.494828735 +0000 UTC m=+40.809920968" Apr 22 19:58:25.512244 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:25.512200 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x55sm" podStartSLOduration=35.596383873 podStartE2EDuration="40.512187806s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:58:19.510276718 +0000 UTC m=+34.825368943" lastFinishedPulling="2026-04-22 19:58:24.426080664 +0000 UTC m=+39.741172876" observedRunningTime="2026-04-22 19:58:25.511527486 +0000 UTC m=+40.826619714" watchObservedRunningTime="2026-04-22 19:58:25.512187806 +0000 UTC m=+40.827280037" Apr 22 19:58:25.877327 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:25.877291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf" Apr 22 19:58:25.877501 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:58:25.877341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:58:25.877501 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:25.877431 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:25.877501 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:25.877434 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:25.877600 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:25.877510 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls podName:82d3388d-34c5-45f5-82dd-28252d41e89a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:33.877494719 +0000 UTC m=+49.192586928 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls") pod "dns-default-ptknf" (UID: "82d3388d-34c5-45f5-82dd-28252d41e89a") : secret "dns-default-metrics-tls" not found Apr 22 19:58:25.877600 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:25.877443 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c5956889-gpgjs: secret "image-registry-tls" not found Apr 22 19:58:25.877600 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:25.877567 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls podName:f493ef8a-2452-4069-a58f-62b9adde6d11 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:33.877556382 +0000 UTC m=+49.192648592 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls") pod "image-registry-69c5956889-gpgjs" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11") : secret "image-registry-tls" not found
Apr 22 19:58:25.977854 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:25.977808 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v"
Apr 22 19:58:25.977997 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:25.977949 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:58:25.978034 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:25.978007 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert podName:a663c8f0-aa0a-4c22-a907-7ecf606a4790 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:33.977990225 +0000 UTC m=+49.293082450 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert") pod "ingress-canary-svd5v" (UID: "a663c8f0-aa0a-4c22-a907-7ecf606a4790") : secret "canary-serving-cert" not found
Apr 22 19:58:33.929308 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:33.929271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf"
Apr 22 19:58:33.929308 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:33.929314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs"
Apr 22 19:58:33.929774 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:33.929424 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:58:33.929774 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:33.929435 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-69c5956889-gpgjs: secret "image-registry-tls" not found
Apr 22 19:58:33.929774 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:33.929439 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:58:33.929774 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:33.929486 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls podName:f493ef8a-2452-4069-a58f-62b9adde6d11 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:49.929471048 +0000 UTC m=+65.244563259 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls") pod "image-registry-69c5956889-gpgjs" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11") : secret "image-registry-tls" not found
Apr 22 19:58:33.929774 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:33.929515 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls podName:82d3388d-34c5-45f5-82dd-28252d41e89a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:49.929497414 +0000 UTC m=+65.244589642 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls") pod "dns-default-ptknf" (UID: "82d3388d-34c5-45f5-82dd-28252d41e89a") : secret "dns-default-metrics-tls" not found
Apr 22 19:58:34.029946 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:34.029913 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v"
Apr 22 19:58:34.030100 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:34.030077 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:58:34.030165 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:34.030155 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert podName:a663c8f0-aa0a-4c22-a907-7ecf606a4790 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:50.030133989 +0000 UTC m=+65.345226201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert") pod "ingress-canary-svd5v" (UID: "a663c8f0-aa0a-4c22-a907-7ecf606a4790") : secret "canary-serving-cert" not found
Apr 22 19:58:35.644035 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:35.644008 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-msxbb_3d105cfe-1e71-45ef-b072-4f6de04ca9c1/dns-node-resolver/0.log"
Apr 22 19:58:36.640790 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:36.640763 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rpfnc_8b248b8e-1022-47ab-b16f-e3e4f3ee7abb/node-ca/0.log"
Apr 22 19:58:43.450280 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:43.450253 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wrbxl"
Apr 22 19:58:49.940103 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:49.940057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs"
Apr 22 19:58:49.940581 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:49.940170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf"
Apr 22 19:58:49.943978 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:49.943952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/82d3388d-34c5-45f5-82dd-28252d41e89a-metrics-tls\") pod \"dns-default-ptknf\" (UID: \"82d3388d-34c5-45f5-82dd-28252d41e89a\") " pod="openshift-dns/dns-default-ptknf"
Apr 22 19:58:49.944099 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:49.944075 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"image-registry-69c5956889-gpgjs\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " pod="openshift-image-registry/image-registry-69c5956889-gpgjs"
Apr 22 19:58:50.040855 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.040801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v"
Apr 22 19:58:50.043099 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.043077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a663c8f0-aa0a-4c22-a907-7ecf606a4790-cert\") pod \"ingress-canary-svd5v\" (UID: \"a663c8f0-aa0a-4c22-a907-7ecf606a4790\") " pod="openshift-ingress-canary/ingress-canary-svd5v"
Apr 22 19:58:50.207382 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.207303 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wxhns\""
Apr 22 19:58:50.215281 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.215259 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69c5956889-gpgjs"
Apr 22 19:58:50.222289 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.222263 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wrr4x\""
Apr 22 19:58:50.230641 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.230618 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ptknf"
Apr 22 19:58:50.246510 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.246480 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2xv8f\""
Apr 22 19:58:50.254004 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.253965 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-svd5v"
Apr 22 19:58:50.354368 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.354336 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69c5956889-gpgjs"]
Apr 22 19:58:50.357532 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:58:50.357499 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf493ef8a_2452_4069_a58f_62b9adde6d11.slice/crio-5d29dd15e64bd936f27ac79f31e4b40e7b116dde735476a60b2ea7f76222be75 WatchSource:0}: Error finding container 5d29dd15e64bd936f27ac79f31e4b40e7b116dde735476a60b2ea7f76222be75: Status 404 returned error can't find the container with id 5d29dd15e64bd936f27ac79f31e4b40e7b116dde735476a60b2ea7f76222be75
Apr 22 19:58:50.365566 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.365537 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ptknf"]
Apr 22 19:58:50.367850 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:58:50.367812 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d3388d_34c5_45f5_82dd_28252d41e89a.slice/crio-73c1a686a5c770dc5201a577d09e66a77e51955510bca80776bfec1582731ba5 WatchSource:0}: Error finding container 73c1a686a5c770dc5201a577d09e66a77e51955510bca80776bfec1582731ba5: Status 404 returned error can't find the container with id 73c1a686a5c770dc5201a577d09e66a77e51955510bca80776bfec1582731ba5
Apr 22 19:58:50.389215 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.389192 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-svd5v"]
Apr 22 19:58:50.399737 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:58:50.399707 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda663c8f0_aa0a_4c22_a907_7ecf606a4790.slice/crio-d209ecbba2b9c5d14142f82f51263b70be716d69daf1d8090c3d4002687aee2e WatchSource:0}: Error finding container d209ecbba2b9c5d14142f82f51263b70be716d69daf1d8090c3d4002687aee2e: Status 404 returned error can't find the container with id d209ecbba2b9c5d14142f82f51263b70be716d69daf1d8090c3d4002687aee2e
Apr 22 19:58:50.531799 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.531760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" event={"ID":"f493ef8a-2452-4069-a58f-62b9adde6d11","Type":"ContainerStarted","Data":"13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d"}
Apr 22 19:58:50.531990 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.531804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" event={"ID":"f493ef8a-2452-4069-a58f-62b9adde6d11","Type":"ContainerStarted","Data":"5d29dd15e64bd936f27ac79f31e4b40e7b116dde735476a60b2ea7f76222be75"}
Apr 22 19:58:50.531990 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.531894 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69c5956889-gpgjs"
Apr 22 19:58:50.532831 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.532807 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-svd5v" event={"ID":"a663c8f0-aa0a-4c22-a907-7ecf606a4790","Type":"ContainerStarted","Data":"d209ecbba2b9c5d14142f82f51263b70be716d69daf1d8090c3d4002687aee2e"}
Apr 22 19:58:50.533732 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.533705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ptknf" event={"ID":"82d3388d-34c5-45f5-82dd-28252d41e89a","Type":"ContainerStarted","Data":"73c1a686a5c770dc5201a577d09e66a77e51955510bca80776bfec1582731ba5"}
Apr 22 19:58:50.549253 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.549217 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" podStartSLOduration=42.549205626 podStartE2EDuration="42.549205626s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:50.548688689 +0000 UTC m=+65.863780921" watchObservedRunningTime="2026-04-22 19:58:50.549205626 +0000 UTC m=+65.864297858"
Apr 22 19:58:50.951190 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.951103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:50.954113 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.954084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8-metrics-certs\") pod \"network-metrics-daemon-5dv89\" (UID: \"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8\") " pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:50.967647 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.967620 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gl2tf\""
Apr 22 19:58:50.975152 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:50.975125 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5dv89"
Apr 22 19:58:51.132202 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:51.132169 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5dv89"]
Apr 22 19:58:51.135996 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:58:51.135964 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd77c3bfb_9ca8_4f41_8c75_d15f2c1ab3b8.slice/crio-d2257cfae66ce59d8a143bb784b83a92cf57e09c3dd1fb2883ecf4f925d34c47 WatchSource:0}: Error finding container d2257cfae66ce59d8a143bb784b83a92cf57e09c3dd1fb2883ecf4f925d34c47: Status 404 returned error can't find the container with id d2257cfae66ce59d8a143bb784b83a92cf57e09c3dd1fb2883ecf4f925d34c47
Apr 22 19:58:51.537562 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:51.537507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5dv89" event={"ID":"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8","Type":"ContainerStarted","Data":"d2257cfae66ce59d8a143bb784b83a92cf57e09c3dd1fb2883ecf4f925d34c47"}
Apr 22 19:58:53.543681 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:53.543648 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-svd5v" event={"ID":"a663c8f0-aa0a-4c22-a907-7ecf606a4790","Type":"ContainerStarted","Data":"9949b54ed26ead5b70be0225905365f0e10301d5201bc0987837ba15d8343bbb"}
Apr 22 19:58:53.545188 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:53.545161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ptknf" event={"ID":"82d3388d-34c5-45f5-82dd-28252d41e89a","Type":"ContainerStarted","Data":"8ad73ba05945fb7eb857f0c4cb95612195c0a8c6e1a50ec87b3b90cec8439d68"}
Apr 22 19:58:53.545294 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:53.545197 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ptknf" event={"ID":"82d3388d-34c5-45f5-82dd-28252d41e89a","Type":"ContainerStarted","Data":"98481bcadacd06918ddf7fd233f9d2208b1319bc69342b54dad8b1d708b7d5c0"}
Apr 22 19:58:53.545294 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:53.545271 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ptknf"
Apr 22 19:58:53.546485 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:53.546466 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5dv89" event={"ID":"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8","Type":"ContainerStarted","Data":"6625df25b8ef9b050f76b432dabca166304ac6ff18e06bcc82ace3dd019977f2"}
Apr 22 19:58:53.559206 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:53.559160 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-svd5v" podStartSLOduration=32.950348528 podStartE2EDuration="35.559148193s" podCreationTimestamp="2026-04-22 19:58:18 +0000 UTC" firstStartedPulling="2026-04-22 19:58:50.401472799 +0000 UTC m=+65.716565008" lastFinishedPulling="2026-04-22 19:58:53.010272463 +0000 UTC m=+68.325364673" observedRunningTime="2026-04-22 19:58:53.558281971 +0000 UTC m=+68.873374200" watchObservedRunningTime="2026-04-22 19:58:53.559148193 +0000 UTC m=+68.874240419"
Apr 22 19:58:53.574847 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:53.574539 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ptknf" podStartSLOduration=32.933938088 podStartE2EDuration="35.574526336s" podCreationTimestamp="2026-04-22 19:58:18 +0000 UTC" firstStartedPulling="2026-04-22 19:58:50.369688834 +0000 UTC m=+65.684781049" lastFinishedPulling="2026-04-22 19:58:53.010277084 +0000 UTC m=+68.325369297" observedRunningTime="2026-04-22 19:58:53.574297298 +0000 UTC m=+68.889389570" watchObservedRunningTime="2026-04-22 19:58:53.574526336 +0000 UTC m=+68.889618568"
Apr 22 19:58:54.550400 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:54.550363 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5dv89" event={"ID":"d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8","Type":"ContainerStarted","Data":"9398bc01d7c03f58a98c36ef78ea7a8c75e6aeea0e800b4a69661a8058119ea2"}
Apr 22 19:58:54.565536 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:54.565493 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5dv89" podStartSLOduration=67.492255168 podStartE2EDuration="1m9.565480097s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.138783304 +0000 UTC m=+66.453875513" lastFinishedPulling="2026-04-22 19:58:53.212008215 +0000 UTC m=+68.527100442" observedRunningTime="2026-04-22 19:58:54.564875545 +0000 UTC m=+69.879967772" watchObservedRunningTime="2026-04-22 19:58:54.565480097 +0000 UTC m=+69.880572328"
Apr 22 19:58:56.487316 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:56.487286 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-t68sf"
Apr 22 19:58:57.017521 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.017489 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-khbvn"]
Apr 22 19:58:57.022552 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.022530 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.030600 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.030570 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 19:58:57.030695 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.030636 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-t9dg2\""
Apr 22 19:58:57.030695 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.030636 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 19:58:57.030775 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.030713 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 19:58:57.031531 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.031514 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69c5956889-gpgjs"]
Apr 22 19:58:57.034560 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.034540 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 19:58:57.040579 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.040558 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-khbvn"]
Apr 22 19:58:57.092986 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.092957 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"]
Apr 22 19:58:57.095395 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.095372 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9bbf2aad-4420-4e3a-9215-7d09954398fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.095482 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.095406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9bbf2aad-4420-4e3a-9215-7d09954398fa-data-volume\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.095482 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.095432 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9bbf2aad-4420-4e3a-9215-7d09954398fa-crio-socket\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.095482 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.095455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9bbf2aad-4420-4e3a-9215-7d09954398fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.095580 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.095507 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqfxw\" (UniqueName: \"kubernetes.io/projected/9bbf2aad-4420-4e3a-9215-7d09954398fa-kube-api-access-dqfxw\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.096534 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.096521 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"
Apr 22 19:58:57.099908 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.099890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 19:58:57.100058 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.100041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 19:58:57.100108 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.100063 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5h5ch\""
Apr 22 19:58:57.111599 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.111574 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"]
Apr 22 19:58:57.196606 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.196581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9bbf2aad-4420-4e3a-9215-7d09954398fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.196716 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.196608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9bbf2aad-4420-4e3a-9215-7d09954398fa-data-volume\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.196716 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.196630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9bbf2aad-4420-4e3a-9215-7d09954398fa-crio-socket\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.196716 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.196650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9bbf2aad-4420-4e3a-9215-7d09954398fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.196716 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.196678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqfxw\" (UniqueName: \"kubernetes.io/projected/9bbf2aad-4420-4e3a-9215-7d09954398fa-kube-api-access-dqfxw\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.196716 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.196707 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnhrz\" (UniqueName: \"kubernetes.io/projected/5db6b33c-709b-4948-8eeb-99cd23ddba38-kube-api-access-cnhrz\") pod \"migrator-74bb7799d9-lpfc8\" (UID: \"5db6b33c-709b-4948-8eeb-99cd23ddba38\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"
Apr 22 19:58:57.196964 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.196733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9bbf2aad-4420-4e3a-9215-7d09954398fa-crio-socket\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.197142 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.197124 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9bbf2aad-4420-4e3a-9215-7d09954398fa-data-volume\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.197749 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.197731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9bbf2aad-4420-4e3a-9215-7d09954398fa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.198873 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.198857 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9bbf2aad-4420-4e3a-9215-7d09954398fa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.216119 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.216094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqfxw\" (UniqueName: \"kubernetes.io/projected/9bbf2aad-4420-4e3a-9215-7d09954398fa-kube-api-access-dqfxw\") pod \"insights-runtime-extractor-khbvn\" (UID: \"9bbf2aad-4420-4e3a-9215-7d09954398fa\") " pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.297124 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.297105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnhrz\" (UniqueName: \"kubernetes.io/projected/5db6b33c-709b-4948-8eeb-99cd23ddba38-kube-api-access-cnhrz\") pod \"migrator-74bb7799d9-lpfc8\" (UID: \"5db6b33c-709b-4948-8eeb-99cd23ddba38\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"
Apr 22 19:58:57.313503 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:57.313484 2577 projected.go:289] Couldn't get configMap openshift-kube-storage-version-migrator/openshift-service-ca.crt: configmap "openshift-service-ca.crt" not found
Apr 22 19:58:57.313555 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:57.313504 2577 projected.go:194] Error preparing data for projected volume kube-api-access-cnhrz for pod openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8: configmap "openshift-service-ca.crt" not found
Apr 22 19:58:57.313590 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:58:57.313557 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5db6b33c-709b-4948-8eeb-99cd23ddba38-kube-api-access-cnhrz podName:5db6b33c-709b-4948-8eeb-99cd23ddba38 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:57.813542231 +0000 UTC m=+73.128634440 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cnhrz" (UniqueName: "kubernetes.io/projected/5db6b33c-709b-4948-8eeb-99cd23ddba38-kube-api-access-cnhrz") pod "migrator-74bb7799d9-lpfc8" (UID: "5db6b33c-709b-4948-8eeb-99cd23ddba38") : configmap "openshift-service-ca.crt" not found
Apr 22 19:58:57.331296 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.331279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-khbvn"
Apr 22 19:58:57.447458 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.447431 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-khbvn"]
Apr 22 19:58:57.450865 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:58:57.450822 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bbf2aad_4420_4e3a_9215_7d09954398fa.slice/crio-dcd0e3a4ea75a8ddbbfdfccf305c9c02e8594850816d99e3eeac675d3c887b59 WatchSource:0}: Error finding container dcd0e3a4ea75a8ddbbfdfccf305c9c02e8594850816d99e3eeac675d3c887b59: Status 404 returned error can't find the container with id dcd0e3a4ea75a8ddbbfdfccf305c9c02e8594850816d99e3eeac675d3c887b59
Apr 22 19:58:57.557811 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.557742 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-khbvn" event={"ID":"9bbf2aad-4420-4e3a-9215-7d09954398fa","Type":"ContainerStarted","Data":"838b08857365501971888684b6e1a92890e006e671a19c5cd9f68c1d5f504e2f"}
Apr 22 19:58:57.557811 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.557780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-khbvn" event={"ID":"9bbf2aad-4420-4e3a-9215-7d09954398fa","Type":"ContainerStarted","Data":"dcd0e3a4ea75a8ddbbfdfccf305c9c02e8594850816d99e3eeac675d3c887b59"}
Apr 22 19:58:57.901769 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.901700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnhrz\" (UniqueName: \"kubernetes.io/projected/5db6b33c-709b-4948-8eeb-99cd23ddba38-kube-api-access-cnhrz\") pod \"migrator-74bb7799d9-lpfc8\" (UID: \"5db6b33c-709b-4948-8eeb-99cd23ddba38\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"
Apr 22 19:58:57.904071 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:57.904051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnhrz\" (UniqueName: \"kubernetes.io/projected/5db6b33c-709b-4948-8eeb-99cd23ddba38-kube-api-access-cnhrz\") pod \"migrator-74bb7799d9-lpfc8\" (UID: \"5db6b33c-709b-4948-8eeb-99cd23ddba38\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"
Apr 22 19:58:58.004665 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:58.004637 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"
Apr 22 19:58:58.153580 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:58.153511 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8"]
Apr 22 19:58:58.157213 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:58:58.157187 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5db6b33c_709b_4948_8eeb_99cd23ddba38.slice/crio-4ed75bb204c9d145a54eaf80f7c818f9a6b53dca33988a52e1d939c7750bb57e WatchSource:0}: Error finding container 4ed75bb204c9d145a54eaf80f7c818f9a6b53dca33988a52e1d939c7750bb57e: Status 404 returned error can't find the container with id 4ed75bb204c9d145a54eaf80f7c818f9a6b53dca33988a52e1d939c7750bb57e
Apr 22 19:58:58.562056 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:58.562019 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8" event={"ID":"5db6b33c-709b-4948-8eeb-99cd23ddba38","Type":"ContainerStarted","Data":"4ed75bb204c9d145a54eaf80f7c818f9a6b53dca33988a52e1d939c7750bb57e"}
Apr 22 19:58:58.564142 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:58:58.564111 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-khbvn" event={"ID":"9bbf2aad-4420-4e3a-9215-7d09954398fa","Type":"ContainerStarted","Data":"96897daf120e7abca09897e157e72c7f2a19d58e590d0bd6233869d8177b1ec2"}
Apr 22 19:59:00.570573 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.570536 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-khbvn" event={"ID":"9bbf2aad-4420-4e3a-9215-7d09954398fa","Type":"ContainerStarted","Data":"6ff9f246b84ce703c5497f6ff2ec06ba30fd31dfaeec1e45e3792a4af5f87827"}
Apr 22 19:59:00.572050 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.572028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8" event={"ID":"5db6b33c-709b-4948-8eeb-99cd23ddba38","Type":"ContainerStarted","Data":"79aa1cfed7f0f6f7e9c70dd967a100016147ac20061068ed28f93ab46509d940"}
Apr 22 19:59:00.572050 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.572052 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8" event={"ID":"5db6b33c-709b-4948-8eeb-99cd23ddba38","Type":"ContainerStarted","Data":"111a953cc977e3acca1816fa8857acc321586b17fa1232dbd096dbf9469a9eb3"}
Apr 22 19:59:00.591453 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.591403 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-khbvn" podStartSLOduration=2.534580885 podStartE2EDuration="4.591390866s" podCreationTimestamp="2026-04-22 19:58:56 +0000 UTC" firstStartedPulling="2026-04-22 19:58:57.506343268 +0000 UTC m=+72.821435477" lastFinishedPulling="2026-04-22 19:58:59.563153245 +0000 UTC m=+74.878245458" observedRunningTime="2026-04-22 19:59:00.589509707 +0000 UTC m=+75.904601968" watchObservedRunningTime="2026-04-22 19:59:00.591390866 +0000 UTC m=+75.906483098"
Apr 22 19:59:00.603673 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.603633 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-lpfc8" podStartSLOduration=2.198506025 podStartE2EDuration="3.603620259s" podCreationTimestamp="2026-04-22 19:58:57 +0000 UTC" firstStartedPulling="2026-04-22 19:58:58.159586225 +0000 UTC m=+73.474678435" lastFinishedPulling="2026-04-22 19:58:59.56470046 +0000 UTC m=+74.879792669" observedRunningTime="2026-04-22 19:59:00.602899659 +0000 UTC m=+75.917991891" watchObservedRunningTime="2026-04-22 19:59:00.603620259 +0000 UTC m=+75.918712490"
Apr 22 19:59:00.742801 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.742771 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-8qm52"]
Apr 22 19:59:00.745912 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.745897 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.748995 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.748962 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 19:59:00.749173 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.749129 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 19:59:00.749173 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.749146 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:59:00.749173 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.749150 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:59:00.749356 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.749264 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:59:00.749407 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.749395 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-wb9l4\"" Apr 22 19:59:00.757297 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.757277 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-8qm52"] Apr 22 19:59:00.819375 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.819347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d43fc95f-82c0-458a-abd4-ae257a54e5b0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.819521 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.819386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.819521 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.819412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.819521 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.819468 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dnmh\" (UniqueName: \"kubernetes.io/projected/d43fc95f-82c0-458a-abd4-ae257a54e5b0-kube-api-access-5dnmh\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.920151 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.920061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 
22 19:59:00.920151 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.920114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.920151 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.920153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dnmh\" (UniqueName: \"kubernetes.io/projected/d43fc95f-82c0-458a-abd4-ae257a54e5b0-kube-api-access-5dnmh\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.920331 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.920172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d43fc95f-82c0-458a-abd4-ae257a54e5b0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.920331 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:00.920294 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 19:59:00.920392 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:00.920374 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-tls podName:d43fc95f-82c0-458a-abd4-ae257a54e5b0 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:01.42035338 +0000 UTC m=+76.735445611 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-8qm52" (UID: "d43fc95f-82c0-458a-abd4-ae257a54e5b0") : secret "prometheus-operator-tls" not found Apr 22 19:59:00.920766 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.920749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d43fc95f-82c0-458a-abd4-ae257a54e5b0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.922489 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.922459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:00.928247 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:00.928227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dnmh\" (UniqueName: \"kubernetes.io/projected/d43fc95f-82c0-458a-abd4-ae257a54e5b0-kube-api-access-5dnmh\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:01.422942 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:01.422906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: 
\"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:01.425247 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:01.425228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d43fc95f-82c0-458a-abd4-ae257a54e5b0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-8qm52\" (UID: \"d43fc95f-82c0-458a-abd4-ae257a54e5b0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:01.654684 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:01.654653 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" Apr 22 19:59:01.767452 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:01.767420 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-8qm52"] Apr 22 19:59:01.770229 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:59:01.770198 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd43fc95f_82c0_458a_abd4_ae257a54e5b0.slice/crio-9860de736c85aa953379fe5985afbb343c322c840d97c943f1c70a8c56a053b2 WatchSource:0}: Error finding container 9860de736c85aa953379fe5985afbb343c322c840d97c943f1c70a8c56a053b2: Status 404 returned error can't find the container with id 9860de736c85aa953379fe5985afbb343c322c840d97c943f1c70a8c56a053b2 Apr 22 19:59:02.578427 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:02.578393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" event={"ID":"d43fc95f-82c0-458a-abd4-ae257a54e5b0","Type":"ContainerStarted","Data":"9860de736c85aa953379fe5985afbb343c322c840d97c943f1c70a8c56a053b2"} Apr 22 19:59:03.552578 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:03.552540 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ptknf" Apr 22 19:59:03.583472 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:03.583440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" event={"ID":"d43fc95f-82c0-458a-abd4-ae257a54e5b0","Type":"ContainerStarted","Data":"3582e8edb7b8844847acbfeedb2eef2b8c1e1084b5f4b7f1764a2ea36bd98fd8"} Apr 22 19:59:03.583619 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:03.583481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" event={"ID":"d43fc95f-82c0-458a-abd4-ae257a54e5b0","Type":"ContainerStarted","Data":"a54e8e2466141e29af8a4de2b2fa94b2d812073145e66f2d88d46d9497f3caca"} Apr 22 19:59:03.598989 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:03.598944 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-8qm52" podStartSLOduration=2.638668871 podStartE2EDuration="3.598928067s" podCreationTimestamp="2026-04-22 19:59:00 +0000 UTC" firstStartedPulling="2026-04-22 19:59:01.772062127 +0000 UTC m=+77.087154337" lastFinishedPulling="2026-04-22 19:59:02.732321304 +0000 UTC m=+78.047413533" observedRunningTime="2026-04-22 19:59:03.598028217 +0000 UTC m=+78.913120472" watchObservedRunningTime="2026-04-22 19:59:03.598928067 +0000 UTC m=+78.914020290" Apr 22 19:59:05.155786 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.155752 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wg2wh"] Apr 22 19:59:05.158520 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.158504 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.161343 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.161321 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:59:05.162051 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.162029 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:59:05.162138 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.162041 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q4d8q\"" Apr 22 19:59:05.162138 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.162112 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:59:05.215695 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.215662 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v94xd"] Apr 22 19:59:05.219197 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.219180 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.222945 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.222920 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 19:59:05.223173 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.223158 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-b5fd7\"" Apr 22 19:59:05.223830 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.223815 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 19:59:05.224719 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.224699 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 19:59:05.248712 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.248673 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v94xd"] Apr 22 19:59:05.251173 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41445d5c-895e-4561-8e7f-4520630856ea-metrics-client-ca\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.251306 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251197 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-textfile\") pod \"node-exporter-wg2wh\" (UID: 
\"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.251306 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.251306 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696lw\" (UniqueName: \"kubernetes.io/projected/41445d5c-895e-4561-8e7f-4520630856ea-kube-api-access-696lw\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.251306 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-sys\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.251497 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-root\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.251497 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.251497 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.251615 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.251515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-wtmp\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.351809 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.351775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.351809 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.351811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-wtmp\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.352054 
ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.351969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-wtmp\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.352054 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.352139 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/252eb25d-0b26-498e-85d2-b99506d56ed4-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.352187 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/252eb25d-0b26-498e-85d2-b99506d56ed4-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.352187 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41445d5c-895e-4561-8e7f-4520630856ea-metrics-client-ca\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.352263 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.352263 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmbr9\" (UniqueName: \"kubernetes.io/projected/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-api-access-gmbr9\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.352263 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-textfile\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.352389 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 
22 19:59:05.352389 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-696lw\" (UniqueName: \"kubernetes.io/projected/41445d5c-895e-4561-8e7f-4520630856ea-kube-api-access-696lw\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.352389 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.352389 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-sys\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.352389 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-root\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.352611 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.352611 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:05.352430 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:59:05.352611 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:05.352493 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls podName:41445d5c-895e-4561-8e7f-4520630856ea nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.852472238 +0000 UTC m=+81.167564452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls") pod "node-exporter-wg2wh" (UID: "41445d5c-895e-4561-8e7f-4520630856ea") : secret "node-exporter-tls" not found Apr 22 19:59:05.356855 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-sys\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.356855 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352579 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-textfile\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.356855 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.352598 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41445d5c-895e-4561-8e7f-4520630856ea-root\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.356855 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.353371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-accelerators-collector-config\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.356855 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.354726 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.357203 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.357051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41445d5c-895e-4561-8e7f-4520630856ea-metrics-client-ca\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.362008 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.361986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-696lw\" (UniqueName: \"kubernetes.io/projected/41445d5c-895e-4561-8e7f-4520630856ea-kube-api-access-696lw\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.453290 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:59:05.453217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/252eb25d-0b26-498e-85d2-b99506d56ed4-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.453290 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.453254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/252eb25d-0b26-498e-85d2-b99506d56ed4-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.453290 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.453279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.453500 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.453311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmbr9\" (UniqueName: \"kubernetes.io/projected/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-api-access-gmbr9\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.453500 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:05.453419 2577 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 19:59:05.453579 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:59:05.453508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.453579 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:05.453571 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-tls podName:252eb25d-0b26-498e-85d2-b99506d56ed4 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.953550927 +0000 UTC m=+81.268643152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-v94xd" (UID: "252eb25d-0b26-498e-85d2-b99506d56ed4") : secret "kube-state-metrics-tls" not found Apr 22 19:59:05.453652 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.453627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.453697 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.453676 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/252eb25d-0b26-498e-85d2-b99506d56ed4-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: 
\"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.453864 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.453826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/252eb25d-0b26-498e-85d2-b99506d56ed4-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.454170 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.454154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.455668 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.455651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.461853 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.461821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmbr9\" (UniqueName: \"kubernetes.io/projected/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-api-access-gmbr9\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.856245 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:59:05.856210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:05.856443 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:05.856364 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:59:05.856443 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:05.856440 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls podName:41445d5c-895e-4561-8e7f-4520630856ea nodeName:}" failed. No retries permitted until 2026-04-22 19:59:06.856424728 +0000 UTC m=+82.171516942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls") pod "node-exporter-wg2wh" (UID: "41445d5c-895e-4561-8e7f-4520630856ea") : secret "node-exporter-tls" not found Apr 22 19:59:05.956704 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.956668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v94xd\" (UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:05.959007 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:05.958985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/252eb25d-0b26-498e-85d2-b99506d56ed4-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-v94xd\" 
(UID: \"252eb25d-0b26-498e-85d2-b99506d56ed4\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:06.152499 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:06.152427 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" Apr 22 19:59:06.270921 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:06.270893 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-v94xd"] Apr 22 19:59:06.273328 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:59:06.273298 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252eb25d_0b26_498e_85d2_b99506d56ed4.slice/crio-44a885ae7fecb038db1adf9ec442965ccd793cc0394c7bfcffd7118b2c4603e8 WatchSource:0}: Error finding container 44a885ae7fecb038db1adf9ec442965ccd793cc0394c7bfcffd7118b2c4603e8: Status 404 returned error can't find the container with id 44a885ae7fecb038db1adf9ec442965ccd793cc0394c7bfcffd7118b2c4603e8 Apr 22 19:59:06.591713 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:06.591682 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" event={"ID":"252eb25d-0b26-498e-85d2-b99506d56ed4","Type":"ContainerStarted","Data":"44a885ae7fecb038db1adf9ec442965ccd793cc0394c7bfcffd7118b2c4603e8"} Apr 22 19:59:06.864303 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:06.864210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:06.866522 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:06.866498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41445d5c-895e-4561-8e7f-4520630856ea-node-exporter-tls\") pod \"node-exporter-wg2wh\" (UID: \"41445d5c-895e-4561-8e7f-4520630856ea\") " pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:06.967481 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:06.967444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wg2wh" Apr 22 19:59:06.977260 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:59:06.977229 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41445d5c_895e_4561_8e7f_4520630856ea.slice/crio-ade495e10b047dee084294c102d8295c729936f58119e2656be77ed902d92ae1 WatchSource:0}: Error finding container ade495e10b047dee084294c102d8295c729936f58119e2656be77ed902d92ae1: Status 404 returned error can't find the container with id ade495e10b047dee084294c102d8295c729936f58119e2656be77ed902d92ae1 Apr 22 19:59:07.038443 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:07.038412 2577 patch_prober.go:28] interesting pod/image-registry-69c5956889-gpgjs container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 19:59:07.038575 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:07.038480 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" podUID="f493ef8a-2452-4069-a58f-62b9adde6d11" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:59:07.594626 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:07.594603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wg2wh" 
event={"ID":"41445d5c-895e-4561-8e7f-4520630856ea","Type":"ContainerStarted","Data":"ade495e10b047dee084294c102d8295c729936f58119e2656be77ed902d92ae1"} Apr 22 19:59:08.599376 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:08.599342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" event={"ID":"252eb25d-0b26-498e-85d2-b99506d56ed4","Type":"ContainerStarted","Data":"efa1d78ce48159b14f27355eeb8b93449c32b11047f14f8b981ade71d14759c1"} Apr 22 19:59:08.599376 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:08.599381 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" event={"ID":"252eb25d-0b26-498e-85d2-b99506d56ed4","Type":"ContainerStarted","Data":"9870613aa8e5d623fdff1f154a6c1184dd7864b7fb5ac9c826fb404e2f97b3cb"} Apr 22 19:59:08.599829 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:08.599393 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" event={"ID":"252eb25d-0b26-498e-85d2-b99506d56ed4","Type":"ContainerStarted","Data":"95ecb13d08ecc43065a5ab3870ed72614e04c0c616e6d020be3c324586eaba66"} Apr 22 19:59:08.600779 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:08.600748 2577 generic.go:358] "Generic (PLEG): container finished" podID="41445d5c-895e-4561-8e7f-4520630856ea" containerID="04a19f1df1f54660234e03e9f232aea945cb1221534dd5b34ea2daeea05c8a76" exitCode=0 Apr 22 19:59:08.600896 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:08.600784 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wg2wh" event={"ID":"41445d5c-895e-4561-8e7f-4520630856ea","Type":"ContainerDied","Data":"04a19f1df1f54660234e03e9f232aea945cb1221534dd5b34ea2daeea05c8a76"} Apr 22 19:59:08.622593 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:08.622542 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-v94xd" podStartSLOduration=2.313082185 podStartE2EDuration="3.622528729s" podCreationTimestamp="2026-04-22 19:59:05 +0000 UTC" firstStartedPulling="2026-04-22 19:59:06.275174433 +0000 UTC m=+81.590266645" lastFinishedPulling="2026-04-22 19:59:07.584620976 +0000 UTC m=+82.899713189" observedRunningTime="2026-04-22 19:59:08.620809742 +0000 UTC m=+83.935901975" watchObservedRunningTime="2026-04-22 19:59:08.622528729 +0000 UTC m=+83.937620960" Apr 22 19:59:09.608390 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.608353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wg2wh" event={"ID":"41445d5c-895e-4561-8e7f-4520630856ea","Type":"ContainerStarted","Data":"392cef8227e8553f0ac5fec3d2f26eb96b9fd37c0f443b6dd667f24f6fe40884"} Apr 22 19:59:09.608390 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.608398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wg2wh" event={"ID":"41445d5c-895e-4561-8e7f-4520630856ea","Type":"ContainerStarted","Data":"e848f1d6b598351979bac8e8004440197fc031f042147faf861e07346185f661"} Apr 22 19:59:09.633765 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.633723 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wg2wh" podStartSLOduration=3.6030006930000003 podStartE2EDuration="4.633707175s" podCreationTimestamp="2026-04-22 19:59:05 +0000 UTC" firstStartedPulling="2026-04-22 19:59:06.979108247 +0000 UTC m=+82.294200457" lastFinishedPulling="2026-04-22 19:59:08.009814729 +0000 UTC m=+83.324906939" observedRunningTime="2026-04-22 19:59:09.631895915 +0000 UTC m=+84.946988147" watchObservedRunningTime="2026-04-22 19:59:09.633707175 +0000 UTC m=+84.948799408" Apr 22 19:59:09.743611 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.743582 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/metrics-server-7bcf7f978f-v75zc"] Apr 22 19:59:09.746599 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.746580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.749522 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.749501 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:59:09.749661 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.749641 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-l56x4\"" Apr 22 19:59:09.749732 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.749671 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 19:59:09.749883 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.749870 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 19:59:09.750425 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.750407 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 19:59:09.750717 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.750704 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-52dgc0s3t66ue\"" Apr 22 19:59:09.757166 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.757147 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7bcf7f978f-v75zc"] Apr 22 19:59:09.891053 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.890967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/061643e4-5536-4b98-a3db-7dc78b143be4-metrics-server-audit-profiles\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.891053 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.891009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp8kj\" (UniqueName: \"kubernetes.io/projected/061643e4-5536-4b98-a3db-7dc78b143be4-kube-api-access-jp8kj\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.891246 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.891088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/061643e4-5536-4b98-a3db-7dc78b143be4-audit-log\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.891246 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.891150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-client-ca-bundle\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.891246 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.891200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-secret-metrics-server-tls\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: 
\"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.891246 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.891229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-secret-metrics-server-client-certs\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.891383 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.891249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061643e4-5536-4b98-a3db-7dc78b143be4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.902195 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.902167 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl"] Apr 22 19:59:09.905167 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.905148 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" Apr 22 19:59:09.907222 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.907200 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 19:59:09.907308 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.907265 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-642vv\"" Apr 22 19:59:09.916048 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.916029 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl"] Apr 22 19:59:09.991830 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.991796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3d17ac7f-8550-4939-b37f-744e02796a0a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fbwbl\" (UID: \"3d17ac7f-8550-4939-b37f-744e02796a0a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" Apr 22 19:59:09.991830 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.991849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-secret-metrics-server-tls\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.992062 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.991873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-secret-metrics-server-client-certs\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: 
\"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.992062 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.991892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061643e4-5536-4b98-a3db-7dc78b143be4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.992062 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.991922 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/061643e4-5536-4b98-a3db-7dc78b143be4-metrics-server-audit-profiles\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.992062 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.991948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp8kj\" (UniqueName: \"kubernetes.io/projected/061643e4-5536-4b98-a3db-7dc78b143be4-kube-api-access-jp8kj\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.992062 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.991982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/061643e4-5536-4b98-a3db-7dc78b143be4-audit-log\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.992062 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.992019 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-client-ca-bundle\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.992451 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.992429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/061643e4-5536-4b98-a3db-7dc78b143be4-audit-log\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.992722 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.992697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061643e4-5536-4b98-a3db-7dc78b143be4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.993158 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.993130 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/061643e4-5536-4b98-a3db-7dc78b143be4-metrics-server-audit-profiles\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.994393 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.994374 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-secret-metrics-server-tls\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: 
\"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.994461 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.994403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-client-ca-bundle\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.994461 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.994430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/061643e4-5536-4b98-a3db-7dc78b143be4-secret-metrics-server-client-certs\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:09.999081 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:09.999060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp8kj\" (UniqueName: \"kubernetes.io/projected/061643e4-5536-4b98-a3db-7dc78b143be4-kube-api-access-jp8kj\") pod \"metrics-server-7bcf7f978f-v75zc\" (UID: \"061643e4-5536-4b98-a3db-7dc78b143be4\") " pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:10.055895 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.055869 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:10.092621 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.092591 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3d17ac7f-8550-4939-b37f-744e02796a0a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fbwbl\" (UID: \"3d17ac7f-8550-4939-b37f-744e02796a0a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" Apr 22 19:59:10.094854 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.094812 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3d17ac7f-8550-4939-b37f-744e02796a0a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fbwbl\" (UID: \"3d17ac7f-8550-4939-b37f-744e02796a0a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" Apr 22 19:59:10.170054 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.170028 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7bcf7f978f-v75zc"] Apr 22 19:59:10.172411 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:59:10.172378 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod061643e4_5536_4b98_a3db_7dc78b143be4.slice/crio-a6872d1955d53bdb85c92e2e24badb5969dcf64b64431943dfe400fe2618f094 WatchSource:0}: Error finding container a6872d1955d53bdb85c92e2e24badb5969dcf64b64431943dfe400fe2618f094: Status 404 returned error can't find the container with id a6872d1955d53bdb85c92e2e24badb5969dcf64b64431943dfe400fe2618f094 Apr 22 19:59:10.215205 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.215173 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" Apr 22 19:59:10.323786 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.323757 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl"] Apr 22 19:59:10.328956 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:59:10.328923 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d17ac7f_8550_4939_b37f_744e02796a0a.slice/crio-a9420893f2cae9c4774712453aea221d123250383e755ea2c14188ee851a4786 WatchSource:0}: Error finding container a9420893f2cae9c4774712453aea221d123250383e755ea2c14188ee851a4786: Status 404 returned error can't find the container with id a9420893f2cae9c4774712453aea221d123250383e755ea2c14188ee851a4786 Apr 22 19:59:10.456783 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.456712 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-68f67747f4-skfhf"] Apr 22 19:59:10.460235 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.460211 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.462542 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.462516 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 19:59:10.462672 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.462568 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 19:59:10.462828 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.462803 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 19:59:10.463150 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.463123 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 19:59:10.463307 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.463246 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 19:59:10.463563 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.463545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-gw8tj\""
Apr 22 19:59:10.469602 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.469582 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 19:59:10.473598 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.473576 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-68f67747f4-skfhf"]
Apr 22 19:59:10.596767 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.596729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.596974 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.596784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-serving-certs-ca-bundle\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.596974 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.596909 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-federate-client-tls\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.596974 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.596950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.597131 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.596980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-telemeter-client-tls\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.597131 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.597045 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-secret-telemeter-client\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.597131 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.597084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-metrics-client-ca\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.597250 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.597137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwqr\" (UniqueName: \"kubernetes.io/projected/c67727b0-cc9d-4536-8f61-bbb39b7943f6-kube-api-access-6pwqr\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.612298 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.612256 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" event={"ID":"3d17ac7f-8550-4939-b37f-744e02796a0a","Type":"ContainerStarted","Data":"a9420893f2cae9c4774712453aea221d123250383e755ea2c14188ee851a4786"}
Apr 22 19:59:10.613392 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.613354 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" event={"ID":"061643e4-5536-4b98-a3db-7dc78b143be4","Type":"ContainerStarted","Data":"a6872d1955d53bdb85c92e2e24badb5969dcf64b64431943dfe400fe2618f094"}
Apr 22 19:59:10.698249 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.698216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-secret-telemeter-client\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.698434 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.698260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-metrics-client-ca\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.698645 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.698606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwqr\" (UniqueName: \"kubernetes.io/projected/c67727b0-cc9d-4536-8f61-bbb39b7943f6-kube-api-access-6pwqr\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.698750 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.698652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.699017 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.698940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-serving-certs-ca-bundle\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.699638 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.699201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-federate-client-tls\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.699638 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.699237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.699638 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.699276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-telemeter-client-tls\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.700099 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.700071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-metrics-client-ca\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.700647 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.700620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-serving-certs-ca-bundle\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.702643 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.702172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c67727b0-cc9d-4536-8f61-bbb39b7943f6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.702643 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.702407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-federate-client-tls\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.702643 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.702457 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.702643 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.702492 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-secret-telemeter-client\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.702643 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.702602 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c67727b0-cc9d-4536-8f61-bbb39b7943f6-telemeter-client-tls\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.708551 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.708484 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwqr\" (UniqueName: \"kubernetes.io/projected/c67727b0-cc9d-4536-8f61-bbb39b7943f6-kube-api-access-6pwqr\") pod \"telemeter-client-68f67747f4-skfhf\" (UID: \"c67727b0-cc9d-4536-8f61-bbb39b7943f6\") " pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.772471 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.772436 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf"
Apr 22 19:59:10.923757 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:10.923531 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-68f67747f4-skfhf"]
Apr 22 19:59:11.355948 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.355817 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:59:11.366592 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.366558 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.369553 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.369162 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-77qnofi8990mi\""
Apr 22 19:59:11.369553 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.369185 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 19:59:11.369553 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.369386 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 19:59:11.369553 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.369405 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 19:59:11.369553 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.369523 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-ql42t\""
Apr 22 19:59:11.369912 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.369586 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 19:59:11.369912 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.369691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 19:59:11.370497 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.370332 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 19:59:11.370497 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.370381 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 19:59:11.370655 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.370552 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 19:59:11.370707 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.370673 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 19:59:11.370763 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.370706 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 19:59:11.371035 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.370881 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 19:59:11.372865 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.372675 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 19:59:11.373525 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.373474 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:59:11.506166 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506321 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config-out\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506321 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506321 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506456 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506456 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506456 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506456 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506648 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506648 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506648 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506758 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506655 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506758 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506758 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506758 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506947 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.506947 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-web-config\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.507045 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.506942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps6t8\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-kube-api-access-ps6t8\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.607695 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.607695 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.607695 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607868 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-web-config\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608000 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.607987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps6t8\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-kube-api-access-ps6t8\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608464 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.608019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608464 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.608060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config-out\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608464 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.608083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608464 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.608128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608464 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.608154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608464 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.608177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.608740 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.608662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.609064 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.609037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.609170 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.609104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.609384 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.609360 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.609538 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.609492 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.616885 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.612265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.616885 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.612387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.616885 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.612589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.616885 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.615736 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.616885 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.616248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.617447 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.616952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config-out\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:11.617504 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.617460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.617504 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.617482 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.618749 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.618694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.618892 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.618752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-web-config\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.619101 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.619083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.619169 ip-10-0-143-253 
kubenswrapper[2577]: I0422 19:59:11.619140 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.620283 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.620250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps6t8\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-kube-api-access-ps6t8\") pod \"prometheus-k8s-0\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.682597 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:11.682562 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.845615 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:59:11.845571 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67727b0_cc9d_4536_8f61_bbb39b7943f6.slice/crio-fdc98e9b400b713627a34e2d3a0e6fe154a5867f6fc7b674e6b2c1a880db8a89 WatchSource:0}: Error finding container fdc98e9b400b713627a34e2d3a0e6fe154a5867f6fc7b674e6b2c1a880db8a89: Status 404 returned error can't find the container with id fdc98e9b400b713627a34e2d3a0e6fe154a5867f6fc7b674e6b2c1a880db8a89 Apr 22 19:59:12.002962 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:12.002901 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:59:12.006446 ip-10-0-143-253 kubenswrapper[2577]: W0422 19:59:12.006411 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7bfcf26_9b65_49a5_b2a1_f1339513c430.slice/crio-7dce9f2be12d434f90f6de027f4b5b896ca086ab7d36888f20018947b3c615d2 
WatchSource:0}: Error finding container 7dce9f2be12d434f90f6de027f4b5b896ca086ab7d36888f20018947b3c615d2: Status 404 returned error can't find the container with id 7dce9f2be12d434f90f6de027f4b5b896ca086ab7d36888f20018947b3c615d2 Apr 22 19:59:12.626530 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:12.626485 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" event={"ID":"061643e4-5536-4b98-a3db-7dc78b143be4","Type":"ContainerStarted","Data":"84cb04bbd5618661276040a906fa485751dcc43208235696136e86f23249dbd0"} Apr 22 19:59:12.629098 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:12.629066 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerStarted","Data":"7dce9f2be12d434f90f6de027f4b5b896ca086ab7d36888f20018947b3c615d2"} Apr 22 19:59:12.630899 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:12.630869 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf" event={"ID":"c67727b0-cc9d-4536-8f61-bbb39b7943f6","Type":"ContainerStarted","Data":"fdc98e9b400b713627a34e2d3a0e6fe154a5867f6fc7b674e6b2c1a880db8a89"} Apr 22 19:59:12.632898 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:12.632866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" event={"ID":"3d17ac7f-8550-4939-b37f-744e02796a0a","Type":"ContainerStarted","Data":"8b9214475158eb8e00bf8c87915899e8045248520ec65927934694c9b33c56c3"} Apr 22 19:59:12.634258 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:12.634239 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" Apr 22 19:59:12.639759 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:12.639731 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" Apr 22 19:59:12.658381 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:12.658337 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" podStartSLOduration=1.957068214 podStartE2EDuration="3.658322587s" podCreationTimestamp="2026-04-22 19:59:09 +0000 UTC" firstStartedPulling="2026-04-22 19:59:10.174311988 +0000 UTC m=+85.489404201" lastFinishedPulling="2026-04-22 19:59:11.875566361 +0000 UTC m=+87.190658574" observedRunningTime="2026-04-22 19:59:12.655019098 +0000 UTC m=+87.970111333" watchObservedRunningTime="2026-04-22 19:59:12.658322587 +0000 UTC m=+87.973414823" Apr 22 19:59:13.637440 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:13.637408 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerID="01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266" exitCode=0 Apr 22 19:59:13.637804 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:13.637519 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerDied","Data":"01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266"} Apr 22 19:59:13.639604 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:13.639480 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf" event={"ID":"c67727b0-cc9d-4536-8f61-bbb39b7943f6","Type":"ContainerStarted","Data":"a09cc3f6a6414e973984866146b95a11bf1c3d01473bbe5c464075e429552d09"} Apr 22 19:59:13.639604 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:13.639510 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf" 
event={"ID":"c67727b0-cc9d-4536-8f61-bbb39b7943f6","Type":"ContainerStarted","Data":"17c6a6ea9861fb1f4e5c63c721b68fb3028758d41271ab955822b8436995495a"} Apr 22 19:59:13.664628 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:13.664313 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fbwbl" podStartSLOduration=3.1184812219999998 podStartE2EDuration="4.664294794s" podCreationTimestamp="2026-04-22 19:59:09 +0000 UTC" firstStartedPulling="2026-04-22 19:59:10.330910354 +0000 UTC m=+85.646002567" lastFinishedPulling="2026-04-22 19:59:11.876723929 +0000 UTC m=+87.191816139" observedRunningTime="2026-04-22 19:59:12.678668115 +0000 UTC m=+87.993760347" watchObservedRunningTime="2026-04-22 19:59:13.664294794 +0000 UTC m=+88.979387026" Apr 22 19:59:14.644468 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:14.644424 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf" event={"ID":"c67727b0-cc9d-4536-8f61-bbb39b7943f6","Type":"ContainerStarted","Data":"0f92f3959c16ec667b41a6febb9b40d2595248cbfbf202ef6884191e311a3d74"} Apr 22 19:59:14.670504 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:14.670445 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-68f67747f4-skfhf" podStartSLOduration=3.089317558 podStartE2EDuration="4.67042766s" podCreationTimestamp="2026-04-22 19:59:10 +0000 UTC" firstStartedPulling="2026-04-22 19:59:11.868603895 +0000 UTC m=+87.183696105" lastFinishedPulling="2026-04-22 19:59:13.449713987 +0000 UTC m=+88.764806207" observedRunningTime="2026-04-22 19:59:14.669823337 +0000 UTC m=+89.984915573" watchObservedRunningTime="2026-04-22 19:59:14.67042766 +0000 UTC m=+89.985519894" Apr 22 19:59:16.652684 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:16.652646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerStarted","Data":"9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593"} Apr 22 19:59:16.653160 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:16.652690 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerStarted","Data":"768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe"} Apr 22 19:59:17.036589 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:17.036561 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:59:18.661991 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:18.661953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerStarted","Data":"fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63"} Apr 22 19:59:18.661991 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:18.661990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerStarted","Data":"108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315"} Apr 22 19:59:18.661991 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:18.662000 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerStarted","Data":"ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809"} Apr 22 19:59:18.662563 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:18.662009 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerStarted","Data":"47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e"} Apr 22 19:59:18.688553 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:18.688468 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.923047082 podStartE2EDuration="7.688449513s" podCreationTimestamp="2026-04-22 19:59:11 +0000 UTC" firstStartedPulling="2026-04-22 19:59:12.008385067 +0000 UTC m=+87.323477280" lastFinishedPulling="2026-04-22 19:59:17.773787496 +0000 UTC m=+93.088879711" observedRunningTime="2026-04-22 19:59:18.687017541 +0000 UTC m=+94.002109786" watchObservedRunningTime="2026-04-22 19:59:18.688449513 +0000 UTC m=+94.003541812" Apr 22 19:59:21.682728 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:21.682690 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:22.051498 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.051438 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" podUID="f493ef8a-2452-4069-a58f-62b9adde6d11" containerName="registry" containerID="cri-o://13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d" gracePeriod=30 Apr 22 19:59:22.283338 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.283318 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:59:22.406675 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.406603 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqpvt\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-kube-api-access-mqpvt\") pod \"f493ef8a-2452-4069-a58f-62b9adde6d11\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " Apr 22 19:59:22.406675 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.406642 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f493ef8a-2452-4069-a58f-62b9adde6d11-ca-trust-extracted\") pod \"f493ef8a-2452-4069-a58f-62b9adde6d11\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " Apr 22 19:59:22.406921 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.406688 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-certificates\") pod \"f493ef8a-2452-4069-a58f-62b9adde6d11\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " Apr 22 19:59:22.406921 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.406720 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-installation-pull-secrets\") pod \"f493ef8a-2452-4069-a58f-62b9adde6d11\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " Apr 22 19:59:22.406921 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.406743 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-trusted-ca\") pod \"f493ef8a-2452-4069-a58f-62b9adde6d11\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " 
Apr 22 19:59:22.406921 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.406780 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-bound-sa-token\") pod \"f493ef8a-2452-4069-a58f-62b9adde6d11\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " Apr 22 19:59:22.406921 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.406808 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") pod \"f493ef8a-2452-4069-a58f-62b9adde6d11\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " Apr 22 19:59:22.406921 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.406875 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-image-registry-private-configuration\") pod \"f493ef8a-2452-4069-a58f-62b9adde6d11\" (UID: \"f493ef8a-2452-4069-a58f-62b9adde6d11\") " Apr 22 19:59:22.407217 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.407152 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f493ef8a-2452-4069-a58f-62b9adde6d11" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:22.407277 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.407254 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f493ef8a-2452-4069-a58f-62b9adde6d11" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:22.409325 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.409297 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f493ef8a-2452-4069-a58f-62b9adde6d11" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:22.409455 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.409350 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f493ef8a-2452-4069-a58f-62b9adde6d11" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:59:22.409455 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.409418 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f493ef8a-2452-4069-a58f-62b9adde6d11" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:59:22.409621 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.409595 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-kube-api-access-mqpvt" (OuterVolumeSpecName: "kube-api-access-mqpvt") pod "f493ef8a-2452-4069-a58f-62b9adde6d11" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11"). InnerVolumeSpecName "kube-api-access-mqpvt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:22.409725 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.409636 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f493ef8a-2452-4069-a58f-62b9adde6d11" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:22.416199 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.416176 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f493ef8a-2452-4069-a58f-62b9adde6d11-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f493ef8a-2452-4069-a58f-62b9adde6d11" (UID: "f493ef8a-2452-4069-a58f-62b9adde6d11"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:59:22.507608 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.507582 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqpvt\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-kube-api-access-mqpvt\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 19:59:22.507608 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.507603 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f493ef8a-2452-4069-a58f-62b9adde6d11-ca-trust-extracted\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 19:59:22.507735 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.507613 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-certificates\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 19:59:22.507735 
ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.507622 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-installation-pull-secrets\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 19:59:22.507735 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.507631 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f493ef8a-2452-4069-a58f-62b9adde6d11-trusted-ca\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 19:59:22.507735 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.507639 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-bound-sa-token\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 19:59:22.507735 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.507647 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f493ef8a-2452-4069-a58f-62b9adde6d11-registry-tls\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 19:59:22.507735 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.507656 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f493ef8a-2452-4069-a58f-62b9adde6d11-image-registry-private-configuration\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 19:59:22.678209 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.678136 2577 generic.go:358] "Generic (PLEG): container finished" podID="f493ef8a-2452-4069-a58f-62b9adde6d11" containerID="13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d" exitCode=0 Apr 22 19:59:22.678209 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.678187 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" Apr 22 19:59:22.678362 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.678224 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" event={"ID":"f493ef8a-2452-4069-a58f-62b9adde6d11","Type":"ContainerDied","Data":"13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d"} Apr 22 19:59:22.678362 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.678267 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69c5956889-gpgjs" event={"ID":"f493ef8a-2452-4069-a58f-62b9adde6d11","Type":"ContainerDied","Data":"5d29dd15e64bd936f27ac79f31e4b40e7b116dde735476a60b2ea7f76222be75"} Apr 22 19:59:22.678362 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.678282 2577 scope.go:117] "RemoveContainer" containerID="13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d" Apr 22 19:59:22.686396 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.686378 2577 scope.go:117] "RemoveContainer" containerID="13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d" Apr 22 19:59:22.686675 ip-10-0-143-253 kubenswrapper[2577]: E0422 19:59:22.686644 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d\": container with ID starting with 13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d not found: ID does not exist" containerID="13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d" Apr 22 19:59:22.686730 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.686677 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d"} err="failed to get container status 
\"13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d\": rpc error: code = NotFound desc = could not find container \"13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d\": container with ID starting with 13ef629c4eaec4fb00204097b3114117ab893b31a0062a62ea0d14d8f5c81a5d not found: ID does not exist" Apr 22 19:59:22.698238 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.698215 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-69c5956889-gpgjs"] Apr 22 19:59:22.702445 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:22.702424 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-69c5956889-gpgjs"] Apr 22 19:59:23.252289 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:23.252252 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f493ef8a-2452-4069-a58f-62b9adde6d11" path="/var/lib/kubelet/pods/f493ef8a-2452-4069-a58f-62b9adde6d11/volumes" Apr 22 19:59:30.056905 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:30.056873 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:30.057280 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:30.056941 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:50.061256 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:50.061230 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 19:59:50.065308 ip-10-0-143-253 kubenswrapper[2577]: I0422 19:59:50.065284 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7bcf7f978f-v75zc" Apr 22 20:00:11.682918 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:11.682829 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:11.701267 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:11.701241 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:11.825269 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:11.825239 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:29.813891 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:29.813857 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 20:00:29.814346 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:29.814267 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="prometheus" containerID="cri-o://768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe" gracePeriod=600
Apr 22 20:00:29.814426 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:29.814330 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="config-reloader" containerID="cri-o://9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593" gracePeriod=600
Apr 22 20:00:29.814426 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:29.814328 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy-web" containerID="cri-o://ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809" gracePeriod=600
Apr 22 20:00:29.814426 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:29.814329 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="thanos-sidecar" containerID="cri-o://47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e" gracePeriod=600
Apr 22 20:00:29.814426 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:29.814398 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy" containerID="cri-o://108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315" gracePeriod=600
Apr 22 20:00:29.814602 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:29.814380 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy-thanos" containerID="cri-o://fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63" gracePeriod=600
Apr 22 20:00:30.866787 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866757 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerID="fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63" exitCode=0
Apr 22 20:00:30.866787 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866782 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerID="108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315" exitCode=0
Apr 22 20:00:30.866787 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866791 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerID="47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e" exitCode=0
Apr 22 20:00:30.867142 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866797 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerID="9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593" exitCode=0
Apr 22 20:00:30.867142 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866804 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerID="768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe" exitCode=0
Apr 22 20:00:30.867142 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerDied","Data":"fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63"}
Apr 22 20:00:30.867142 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866871 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerDied","Data":"108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315"}
Apr 22 20:00:30.867142 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerDied","Data":"47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e"}
Apr 22 20:00:30.867142 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerDied","Data":"9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593"}
Apr 22 20:00:30.867142 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:30.866901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerDied","Data":"768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe"}
Apr 22 20:00:31.058197 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.058177 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:31.131436 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131358 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config-out\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131436 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131394 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-rulefiles-0\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131436 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131422 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-tls-assets\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131452 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps6t8\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-kube-api-access-ps6t8\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131482 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131530 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131557 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-metrics-client-certs\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131583 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-serving-certs-ca-bundle\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131614 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131643 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-grpc-tls\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.131692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131683 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-thanos-prometheus-http-client-file\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.132103 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131710 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-tls\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.132103 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131743 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-web-config\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.132103 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131775 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-kubelet-serving-ca-bundle\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.132103 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131818 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-kube-rbac-proxy\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.132103 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131864 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-db\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.132103 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131892 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-trusted-ca-bundle\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.132103 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.131918 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-metrics-client-ca\") pod \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\" (UID: \"f7bfcf26-9b65-49a5-b2a1-f1339513c430\") "
Apr 22 20:00:31.132829 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.132113 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:31.132829 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.132530 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:31.133744 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.133716 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:31.133972 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.133945 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:31.134238 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.134209 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:00:31.134392 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.134290 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:31.136222 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.136184 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.136377 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.136351 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:00:31.136460 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.136411 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.136624 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.136587 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.136624 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.136609 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config" (OuterVolumeSpecName: "config") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.136847 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.136808 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.137276 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.137247 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-kube-api-access-ps6t8" (OuterVolumeSpecName: "kube-api-access-ps6t8") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "kube-api-access-ps6t8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:00:31.137369 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.137274 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config-out" (OuterVolumeSpecName: "config-out") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:00:31.137422 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.137379 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.137464 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.137444 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.138450 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.138422 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.146984 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.146958 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-web-config" (OuterVolumeSpecName: "web-config") pod "f7bfcf26-9b65-49a5-b2a1-f1339513c430" (UID: "f7bfcf26-9b65-49a5-b2a1-f1339513c430"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:31.233470 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233437 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-kube-rbac-proxy\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233470 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233466 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-db\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233470 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233476 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-trusted-ca-bundle\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233488 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-metrics-client-ca\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233498 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config-out\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233506 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233515 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-tls-assets\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233525 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ps6t8\" (UniqueName: \"kubernetes.io/projected/f7bfcf26-9b65-49a5-b2a1-f1339513c430-kube-api-access-ps6t8\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233534 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233544 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-config\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233553 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-metrics-client-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233562 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233571 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233580 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-grpc-tls\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233590 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-thanos-prometheus-http-client-file\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233599 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-secret-prometheus-k8s-tls\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233607 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7bfcf26-9b65-49a5-b2a1-f1339513c430-web-config\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.233678 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.233617 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7bfcf26-9b65-49a5-b2a1-f1339513c430-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:00:31.871975 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.871943 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerID="ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809" exitCode=0
Apr 22 20:00:31.872342 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.872027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerDied","Data":"ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809"}
Apr 22 20:00:31.872342 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.872042 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:31.872342 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.872071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f7bfcf26-9b65-49a5-b2a1-f1339513c430","Type":"ContainerDied","Data":"7dce9f2be12d434f90f6de027f4b5b896ca086ab7d36888f20018947b3c615d2"}
Apr 22 20:00:31.872342 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.872089 2577 scope.go:117] "RemoveContainer" containerID="fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63"
Apr 22 20:00:31.879090 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.879072 2577 scope.go:117] "RemoveContainer" containerID="108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315"
Apr 22 20:00:31.885348 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.885334 2577 scope.go:117] "RemoveContainer" containerID="ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809"
Apr 22 20:00:31.891426 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.891404 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 20:00:31.891966 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.891951 2577 scope.go:117] "RemoveContainer" containerID="47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e"
Apr 22 20:00:31.895278 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.895255 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 20:00:31.898474 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.898456 2577 scope.go:117] "RemoveContainer" containerID="9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593"
Apr 22 20:00:31.904219 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.904203 2577 scope.go:117] "RemoveContainer" containerID="768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe"
Apr 22 20:00:31.910658 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.910640 2577 scope.go:117] "RemoveContainer" containerID="01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266"
Apr 22 20:00:31.916520 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.916503 2577 scope.go:117] "RemoveContainer" containerID="fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63"
Apr 22 20:00:31.916758 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:00:31.916739 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63\": container with ID starting with fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63 not found: ID does not exist" containerID="fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63"
Apr 22 20:00:31.916801 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.916767 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63"} err="failed to get container status \"fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63\": rpc error: code = NotFound desc = could not find container \"fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63\": container with ID starting with fa7167e4384ad03be1946f1b8f89f15906ae5d3cd5e0c381e349f87197dacb63 not found: ID does not exist"
Apr 22 20:00:31.916801 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.916788 2577 scope.go:117] "RemoveContainer" containerID="108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315"
Apr 22 20:00:31.917218 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:00:31.917197 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315\": container with ID starting with 108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315 not found: ID does not exist" containerID="108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315"
Apr 22 20:00:31.917282 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.917238 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315"} err="failed to get container status \"108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315\": rpc error: code = NotFound desc = could not find container \"108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315\": container with ID starting with 108759ea53acd24b749f2e8709b558b8b180395f2c88544283df12e6076e4315 not found: ID does not exist"
Apr 22 20:00:31.917282 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.917260 2577 scope.go:117] "RemoveContainer" containerID="ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809"
Apr 22 20:00:31.917683 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:00:31.917659 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809\": container with ID starting with ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809 not found: ID does not exist" containerID="ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809"
Apr 22 20:00:31.917774 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.917692 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809"} err="failed to get container status \"ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809\": rpc error: code = NotFound desc = could not find container \"ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809\": container with ID starting with ff957654289c80d0771b4a1b47a76bf023b5b142ad86b3cdf9b77196c2d15809 not found: ID does not exist"
Apr 22 20:00:31.917774 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.917714 2577 scope.go:117] "RemoveContainer" containerID="47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e"
Apr 22 20:00:31.918009 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:00:31.917984 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e\": container with ID starting with 47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e not found: ID does not exist" containerID="47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e"
Apr 22 20:00:31.918145 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.918013 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e"} err="failed to get container status \"47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e\": rpc error: code = NotFound desc = could not find container \"47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e\": container with ID starting with 47255050945cd82cebf7be89240aac373d3c63fb2b0779ce3be7ba3ac0b69b8e not found: ID does not exist"
Apr 22 20:00:31.918145 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.918034 2577 scope.go:117] "RemoveContainer" containerID="9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593"
Apr 22 20:00:31.918305 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:00:31.918284 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593\": container with ID starting with 9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593 not found: ID does not exist" containerID="9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593"
Apr 22 20:00:31.918342 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.918311 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593"} err="failed to get container status \"9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593\": rpc error: code = NotFound desc = could not find container \"9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593\": container with ID starting with 9d36509763f076b7d9f58d43396cb69a54cdc04781e3deb54bb54ce112e6a593 not found: ID does not exist"
Apr 22 20:00:31.918342 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.918326 2577 scope.go:117] "RemoveContainer" containerID="768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe"
Apr 22 20:00:31.918591 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:00:31.918568 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe\": container with ID starting with 768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe not found: ID does not exist" containerID="768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe"
Apr 22 20:00:31.918645 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.918600 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe"} err="failed to get container status \"768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe\": rpc error: code = NotFound desc = could not find container \"768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe\": container with ID starting with 768cb8abd3099f60d7f1c5fb4985a6d2dfdfc40c0bc940ebf8a2f54e8f4c56fe not found: ID does not exist"
Apr 22 20:00:31.918645 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.918623 2577 scope.go:117] "RemoveContainer" containerID="01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266"
Apr 22 20:00:31.918889 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:00:31.918870 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266\": container with ID starting with 01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266 not found: ID does not exist" containerID="01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266"
Apr 22 20:00:31.918957 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.918895 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266"} err="failed to get container status \"01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266\": rpc error: code = NotFound desc = could not find container \"01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266\": container with ID starting with 01af17d93af70b9f4c3307a643df01d285feded8043f809797002d7647d30266 not found: ID does not exist"
Apr 22 20:00:31.919137 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919122 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 20:00:31.919375 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919364 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="thanos-sidecar"
Apr 22 20:00:31.919418 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919376 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="thanos-sidecar"
Apr 22 20:00:31.919418 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919389 2577
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy" Apr 22 20:00:31.919418 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919394 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy" Apr 22 20:00:31.919418 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919401 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy-thanos" Apr 22 20:00:31.919418 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919406 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy-thanos" Apr 22 20:00:31.919418 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919414 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy-web" Apr 22 20:00:31.919418 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919419 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy-web" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919428 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="prometheus" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919433 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="prometheus" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919441 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="init-config-reloader" Apr 22 20:00:31.919635 
ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919446 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="init-config-reloader" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919454 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f493ef8a-2452-4069-a58f-62b9adde6d11" containerName="registry" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919459 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f493ef8a-2452-4069-a58f-62b9adde6d11" containerName="registry" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919467 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="config-reloader" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919473 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="config-reloader" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919513 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919519 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f493ef8a-2452-4069-a58f-62b9adde6d11" containerName="registry" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919528 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="config-reloader" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919534 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy-web" Apr 22 20:00:31.919635 
ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919540 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="prometheus" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919545 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="thanos-sidecar" Apr 22 20:00:31.919635 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.919551 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" containerName="kube-rbac-proxy-thanos" Apr 22 20:00:31.923935 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.923916 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:31.926682 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.926650 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 20:00:31.926781 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.926650 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 20:00:31.926781 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.926728 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-77qnofi8990mi\"" Apr 22 20:00:31.926927 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.926780 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 20:00:31.926927 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.926782 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 20:00:31.927376 
ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.927356 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 20:00:31.927461 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.927391 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-ql42t\"" Apr 22 20:00:31.927461 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.927441 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 20:00:31.927461 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.927454 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 20:00:31.927595 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.927499 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 20:00:31.929748 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.929712 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 20:00:31.929876 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.929777 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 20:00:31.933760 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.933672 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 20:00:31.936234 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.936209 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 20:00:31.940776 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:31.940756 
2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:00:32.040733 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.040910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.040910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.040910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5d1ab1f0-f27c-4498-8914-f5927c356290-config-out\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.040910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040795 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.040910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.040910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-web-config\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.040910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.040910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040938 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5d1ab1f0-f27c-4498-8914-f5927c356290-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.040994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.041064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-config\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.041090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.041117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.041200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.041137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9rb\" (UniqueName: \"kubernetes.io/projected/5d1ab1f0-f27c-4498-8914-f5927c356290-kube-api-access-2c9rb\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.141964 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.141881 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9rb\" (UniqueName: \"kubernetes.io/projected/5d1ab1f0-f27c-4498-8914-f5927c356290-kube-api-access-2c9rb\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.141964 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.141939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.141964 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.141968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142190 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142190 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5d1ab1f0-f27c-4498-8914-f5927c356290-config-out\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142313 
ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142386 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142345 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142443 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-web-config\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142443 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142736 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5d1ab1f0-f27c-4498-8914-f5927c356290-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142736 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142736 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-config\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142736 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.142736 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.142689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.143506 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.143483 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.145031 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.145005 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5d1ab1f0-f27c-4498-8914-f5927c356290-config-out\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.145197 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.145162 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.145197 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.145179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.145349 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.145311 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.146152 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.145658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.146152 ip-10-0-143-253 kubenswrapper[2577]: I0422 
20:00:32.145811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-web-config\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.146152 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.145818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.146152 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.145878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.146152 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.146099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.146502 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.146470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.146629 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:00:32.146586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.147278 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.147257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5d1ab1f0-f27c-4498-8914-f5927c356290-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.147743 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.147719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-config\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.147815 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.147768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.148031 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.148011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5d1ab1f0-f27c-4498-8914-f5927c356290-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.148204 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:00:32.148188 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5d1ab1f0-f27c-4498-8914-f5927c356290-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.150317 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.150294 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9rb\" (UniqueName: \"kubernetes.io/projected/5d1ab1f0-f27c-4498-8914-f5927c356290-kube-api-access-2c9rb\") pod \"prometheus-k8s-0\" (UID: \"5d1ab1f0-f27c-4498-8914-f5927c356290\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.234410 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.234378 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:32.356384 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.356349 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:00:32.359832 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:00:32.359808 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1ab1f0_f27c_4498_8914_f5927c356290.slice/crio-43af03307c0594f832077f1da1b3357b450a88faa892ba8731bd1d9513778273 WatchSource:0}: Error finding container 43af03307c0594f832077f1da1b3357b450a88faa892ba8731bd1d9513778273: Status 404 returned error can't find the container with id 43af03307c0594f832077f1da1b3357b450a88faa892ba8731bd1d9513778273 Apr 22 20:00:32.875924 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.875889 2577 generic.go:358] "Generic (PLEG): container finished" podID="5d1ab1f0-f27c-4498-8914-f5927c356290" containerID="ddd90b55bf76c2ce5236942df99f305cfe6b31cdf2918b3621bb5496c39abf2a" exitCode=0 Apr 22 20:00:32.876323 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:00:32.875977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5d1ab1f0-f27c-4498-8914-f5927c356290","Type":"ContainerDied","Data":"ddd90b55bf76c2ce5236942df99f305cfe6b31cdf2918b3621bb5496c39abf2a"} Apr 22 20:00:32.876323 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:32.876013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5d1ab1f0-f27c-4498-8914-f5927c356290","Type":"ContainerStarted","Data":"43af03307c0594f832077f1da1b3357b450a88faa892ba8731bd1d9513778273"} Apr 22 20:00:33.252043 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:33.252018 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7bfcf26-9b65-49a5-b2a1-f1339513c430" path="/var/lib/kubelet/pods/f7bfcf26-9b65-49a5-b2a1-f1339513c430/volumes" Apr 22 20:00:33.884791 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:33.884709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5d1ab1f0-f27c-4498-8914-f5927c356290","Type":"ContainerStarted","Data":"0e777e32a2ca4d9ea319042a80efb55a20445d9c9c0b45be2ca4f14481b5e950"} Apr 22 20:00:33.884791 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:33.884747 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5d1ab1f0-f27c-4498-8914-f5927c356290","Type":"ContainerStarted","Data":"ecad78184a2af58d970c4e13e59689b26759cf176e8a504be9ba5970f8f4322e"} Apr 22 20:00:33.884791 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:33.884758 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5d1ab1f0-f27c-4498-8914-f5927c356290","Type":"ContainerStarted","Data":"178d8bc2159fb7967874a054f2d16bd0a007ac65c93c32f3e6ea177bbc864d8c"} Apr 22 20:00:33.884791 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:33.884766 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5d1ab1f0-f27c-4498-8914-f5927c356290","Type":"ContainerStarted","Data":"96d42b7a3d7be1062bfe8bcd0b81ec3bc2eee379458522ad9e4035c89b161fb2"} Apr 22 20:00:33.884791 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:33.884774 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5d1ab1f0-f27c-4498-8914-f5927c356290","Type":"ContainerStarted","Data":"7ec9a7fdb073e5c43cd4e97d3ff109011cba8438e810696cd55c6ce4628e90f7"} Apr 22 20:00:33.884791 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:33.884783 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5d1ab1f0-f27c-4498-8914-f5927c356290","Type":"ContainerStarted","Data":"acb04dbd347d99381d24daed0667ac594aac29823caa7d0e7d1aaa5dc7e8a93d"} Apr 22 20:00:33.914374 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:33.914312 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.914293315 podStartE2EDuration="2.914293315s" podCreationTimestamp="2026-04-22 20:00:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:33.912777276 +0000 UTC m=+169.227869545" watchObservedRunningTime="2026-04-22 20:00:33.914293315 +0000 UTC m=+169.229385549" Apr 22 20:00:37.234567 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:00:37.234537 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:32.235235 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:01:32.235193 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:32.250323 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:01:32.250290 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:33.063450 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:01:33.063424 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:02:45.171172 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:02:45.171147 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 20:03:06.162978 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.162943 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fg7f5"] Apr 22 20:03:06.166019 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.166004 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" Apr 22 20:03:06.168162 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.168144 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 20:03:06.168925 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.168902 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-k754f\"" Apr 22 20:03:06.168925 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.168917 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 20:03:06.173103 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.173081 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fg7f5"] Apr 22 20:03:06.260006 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.259961 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72b04ae7-51ee-4997-a6bb-3cbe059ff755-bound-sa-token\") pod 
\"cert-manager-cainjector-68b757865b-fg7f5\" (UID: \"72b04ae7-51ee-4997-a6bb-3cbe059ff755\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" Apr 22 20:03:06.260181 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.260034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbrmd\" (UniqueName: \"kubernetes.io/projected/72b04ae7-51ee-4997-a6bb-3cbe059ff755-kube-api-access-wbrmd\") pod \"cert-manager-cainjector-68b757865b-fg7f5\" (UID: \"72b04ae7-51ee-4997-a6bb-3cbe059ff755\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" Apr 22 20:03:06.360821 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.360780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72b04ae7-51ee-4997-a6bb-3cbe059ff755-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-fg7f5\" (UID: \"72b04ae7-51ee-4997-a6bb-3cbe059ff755\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" Apr 22 20:03:06.361033 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.360832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbrmd\" (UniqueName: \"kubernetes.io/projected/72b04ae7-51ee-4997-a6bb-3cbe059ff755-kube-api-access-wbrmd\") pod \"cert-manager-cainjector-68b757865b-fg7f5\" (UID: \"72b04ae7-51ee-4997-a6bb-3cbe059ff755\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" Apr 22 20:03:06.370959 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.370934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/72b04ae7-51ee-4997-a6bb-3cbe059ff755-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-fg7f5\" (UID: \"72b04ae7-51ee-4997-a6bb-3cbe059ff755\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" Apr 22 20:03:06.371143 ip-10-0-143-253 kubenswrapper[2577]: I0422 
20:03:06.371122 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbrmd\" (UniqueName: \"kubernetes.io/projected/72b04ae7-51ee-4997-a6bb-3cbe059ff755-kube-api-access-wbrmd\") pod \"cert-manager-cainjector-68b757865b-fg7f5\" (UID: \"72b04ae7-51ee-4997-a6bb-3cbe059ff755\") " pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" Apr 22 20:03:06.482765 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.482689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" Apr 22 20:03:06.596079 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.596041 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-fg7f5"] Apr 22 20:03:06.598704 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:03:06.598671 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72b04ae7_51ee_4997_a6bb_3cbe059ff755.slice/crio-bf03f9e3de3736f3563f25963e92d03a4b4cf9ab5d4003f35a82469bef51fefd WatchSource:0}: Error finding container bf03f9e3de3736f3563f25963e92d03a4b4cf9ab5d4003f35a82469bef51fefd: Status 404 returned error can't find the container with id bf03f9e3de3736f3563f25963e92d03a4b4cf9ab5d4003f35a82469bef51fefd Apr 22 20:03:06.600375 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:06.600357 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:03:07.297689 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:07.297651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" event={"ID":"72b04ae7-51ee-4997-a6bb-3cbe059ff755","Type":"ContainerStarted","Data":"bf03f9e3de3736f3563f25963e92d03a4b4cf9ab5d4003f35a82469bef51fefd"} Apr 22 20:03:08.370359 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.370324 2577 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-9vhkl"] Apr 22 20:03:08.377305 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.377279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:08.380003 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.379975 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-9hm6z\"" Apr 22 20:03:08.380968 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.380937 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-9vhkl"] Apr 22 20:03:08.474940 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.474903 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj2dv\" (UniqueName: \"kubernetes.io/projected/a48b56ba-2256-46ce-8f19-98bad630496b-kube-api-access-qj2dv\") pod \"cert-manager-webhook-587ccfb98-9vhkl\" (UID: \"a48b56ba-2256-46ce-8f19-98bad630496b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:08.475101 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.474989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a48b56ba-2256-46ce-8f19-98bad630496b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-9vhkl\" (UID: \"a48b56ba-2256-46ce-8f19-98bad630496b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:08.576056 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.576012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj2dv\" (UniqueName: \"kubernetes.io/projected/a48b56ba-2256-46ce-8f19-98bad630496b-kube-api-access-qj2dv\") pod \"cert-manager-webhook-587ccfb98-9vhkl\" (UID: \"a48b56ba-2256-46ce-8f19-98bad630496b\") " 
pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:08.576224 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.576119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a48b56ba-2256-46ce-8f19-98bad630496b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-9vhkl\" (UID: \"a48b56ba-2256-46ce-8f19-98bad630496b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:08.584284 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.584254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a48b56ba-2256-46ce-8f19-98bad630496b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-9vhkl\" (UID: \"a48b56ba-2256-46ce-8f19-98bad630496b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:08.584422 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.584312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj2dv\" (UniqueName: \"kubernetes.io/projected/a48b56ba-2256-46ce-8f19-98bad630496b-kube-api-access-qj2dv\") pod \"cert-manager-webhook-587ccfb98-9vhkl\" (UID: \"a48b56ba-2256-46ce-8f19-98bad630496b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:08.690258 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:08.690169 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:09.433376 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:09.433354 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-9vhkl"] Apr 22 20:03:09.435093 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:03:09.435062 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda48b56ba_2256_46ce_8f19_98bad630496b.slice/crio-d0ba642218489dae38a5258390203bbd2d16ea4918f39dae8a65874bb7613b5c WatchSource:0}: Error finding container d0ba642218489dae38a5258390203bbd2d16ea4918f39dae8a65874bb7613b5c: Status 404 returned error can't find the container with id d0ba642218489dae38a5258390203bbd2d16ea4918f39dae8a65874bb7613b5c Apr 22 20:03:10.307962 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:10.307923 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" event={"ID":"72b04ae7-51ee-4997-a6bb-3cbe059ff755","Type":"ContainerStarted","Data":"71646af50e12a2014248649b603907311ea6c207a0104ee9d857d2c8bea036b2"} Apr 22 20:03:10.309063 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:10.309040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" event={"ID":"a48b56ba-2256-46ce-8f19-98bad630496b","Type":"ContainerStarted","Data":"cca1a0f9ea96d529276ef3ffc6e674a48c85bbe03ea0399ed23e3b9976d71003"} Apr 22 20:03:10.309174 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:10.309068 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" event={"ID":"a48b56ba-2256-46ce-8f19-98bad630496b","Type":"ContainerStarted","Data":"d0ba642218489dae38a5258390203bbd2d16ea4918f39dae8a65874bb7613b5c"} Apr 22 20:03:10.309174 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:10.309164 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:10.323932 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:10.323883 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-fg7f5" podStartSLOduration=1.542292583 podStartE2EDuration="4.323870631s" podCreationTimestamp="2026-04-22 20:03:06 +0000 UTC" firstStartedPulling="2026-04-22 20:03:06.600490774 +0000 UTC m=+321.915582984" lastFinishedPulling="2026-04-22 20:03:09.382068822 +0000 UTC m=+324.697161032" observedRunningTime="2026-04-22 20:03:10.322402319 +0000 UTC m=+325.637494551" watchObservedRunningTime="2026-04-22 20:03:10.323870631 +0000 UTC m=+325.638962855" Apr 22 20:03:10.337271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:10.337223 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" podStartSLOduration=2.337204595 podStartE2EDuration="2.337204595s" podCreationTimestamp="2026-04-22 20:03:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:03:10.336093347 +0000 UTC m=+325.651185587" watchObservedRunningTime="2026-04-22 20:03:10.337204595 +0000 UTC m=+325.652296832" Apr 22 20:03:16.314693 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:16.314655 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-9vhkl" Apr 22 20:03:20.351776 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.351739 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j"] Apr 22 20:03:20.358684 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.358656 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" Apr 22 20:03:20.361320 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.361293 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 20:03:20.361439 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.361297 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 20:03:20.362188 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.362175 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-kp85j\"" Apr 22 20:03:20.368248 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.368226 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j"] Apr 22 20:03:20.452667 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.452636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fbe25ee-cbc0-42fd-9592-6015d395cd1f-tmp\") pod \"openshift-lws-operator-bfc7f696d-zws8j\" (UID: \"9fbe25ee-cbc0-42fd-9592-6015d395cd1f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" Apr 22 20:03:20.452667 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.452669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l8x4\" (UniqueName: \"kubernetes.io/projected/9fbe25ee-cbc0-42fd-9592-6015d395cd1f-kube-api-access-4l8x4\") pod \"openshift-lws-operator-bfc7f696d-zws8j\" (UID: \"9fbe25ee-cbc0-42fd-9592-6015d395cd1f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" Apr 22 20:03:20.553541 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.553507 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fbe25ee-cbc0-42fd-9592-6015d395cd1f-tmp\") pod \"openshift-lws-operator-bfc7f696d-zws8j\" (UID: \"9fbe25ee-cbc0-42fd-9592-6015d395cd1f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" Apr 22 20:03:20.553541 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.553541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l8x4\" (UniqueName: \"kubernetes.io/projected/9fbe25ee-cbc0-42fd-9592-6015d395cd1f-kube-api-access-4l8x4\") pod \"openshift-lws-operator-bfc7f696d-zws8j\" (UID: \"9fbe25ee-cbc0-42fd-9592-6015d395cd1f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" Apr 22 20:03:20.553972 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.553949 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fbe25ee-cbc0-42fd-9592-6015d395cd1f-tmp\") pod \"openshift-lws-operator-bfc7f696d-zws8j\" (UID: \"9fbe25ee-cbc0-42fd-9592-6015d395cd1f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" Apr 22 20:03:20.562022 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.562002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l8x4\" (UniqueName: \"kubernetes.io/projected/9fbe25ee-cbc0-42fd-9592-6015d395cd1f-kube-api-access-4l8x4\") pod \"openshift-lws-operator-bfc7f696d-zws8j\" (UID: \"9fbe25ee-cbc0-42fd-9592-6015d395cd1f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" Apr 22 20:03:20.668956 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.668878 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" Apr 22 20:03:20.782137 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:20.782113 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j"] Apr 22 20:03:20.784866 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:03:20.784815 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fbe25ee_cbc0_42fd_9592_6015d395cd1f.slice/crio-2378b7790738ed28ddfad71c46b22186b83cd86e7db501081d11251de0ab3c6e WatchSource:0}: Error finding container 2378b7790738ed28ddfad71c46b22186b83cd86e7db501081d11251de0ab3c6e: Status 404 returned error can't find the container with id 2378b7790738ed28ddfad71c46b22186b83cd86e7db501081d11251de0ab3c6e Apr 22 20:03:21.255270 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.255240 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-b6cj8"] Apr 22 20:03:21.259726 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.259704 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-b6cj8" Apr 22 20:03:21.261902 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.261877 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-vgbqp\"" Apr 22 20:03:21.264984 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.264964 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-b6cj8"] Apr 22 20:03:21.341025 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.340993 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" event={"ID":"9fbe25ee-cbc0-42fd-9592-6015d395cd1f","Type":"ContainerStarted","Data":"2378b7790738ed28ddfad71c46b22186b83cd86e7db501081d11251de0ab3c6e"} Apr 22 20:03:21.357953 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.357931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kjxb\" (UniqueName: \"kubernetes.io/projected/e539b585-55da-4149-a6a5-acf759c55565-kube-api-access-9kjxb\") pod \"cert-manager-79c8d999ff-b6cj8\" (UID: \"e539b585-55da-4149-a6a5-acf759c55565\") " pod="cert-manager/cert-manager-79c8d999ff-b6cj8" Apr 22 20:03:21.358306 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.357982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e539b585-55da-4149-a6a5-acf759c55565-bound-sa-token\") pod \"cert-manager-79c8d999ff-b6cj8\" (UID: \"e539b585-55da-4149-a6a5-acf759c55565\") " pod="cert-manager/cert-manager-79c8d999ff-b6cj8" Apr 22 20:03:21.459125 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.459100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kjxb\" (UniqueName: \"kubernetes.io/projected/e539b585-55da-4149-a6a5-acf759c55565-kube-api-access-9kjxb\") 
pod \"cert-manager-79c8d999ff-b6cj8\" (UID: \"e539b585-55da-4149-a6a5-acf759c55565\") " pod="cert-manager/cert-manager-79c8d999ff-b6cj8" Apr 22 20:03:21.459292 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.459145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e539b585-55da-4149-a6a5-acf759c55565-bound-sa-token\") pod \"cert-manager-79c8d999ff-b6cj8\" (UID: \"e539b585-55da-4149-a6a5-acf759c55565\") " pod="cert-manager/cert-manager-79c8d999ff-b6cj8" Apr 22 20:03:21.469070 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.469040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kjxb\" (UniqueName: \"kubernetes.io/projected/e539b585-55da-4149-a6a5-acf759c55565-kube-api-access-9kjxb\") pod \"cert-manager-79c8d999ff-b6cj8\" (UID: \"e539b585-55da-4149-a6a5-acf759c55565\") " pod="cert-manager/cert-manager-79c8d999ff-b6cj8" Apr 22 20:03:21.469070 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.469051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e539b585-55da-4149-a6a5-acf759c55565-bound-sa-token\") pod \"cert-manager-79c8d999ff-b6cj8\" (UID: \"e539b585-55da-4149-a6a5-acf759c55565\") " pod="cert-manager/cert-manager-79c8d999ff-b6cj8" Apr 22 20:03:21.570359 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.570329 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-b6cj8" Apr 22 20:03:21.692778 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:21.692652 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-b6cj8"] Apr 22 20:03:21.695746 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:03:21.695715 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode539b585_55da_4149_a6a5_acf759c55565.slice/crio-3c33b8efe55079f317440bfa0bf803aec266c61031c43e9cb8c7968c8abf610f WatchSource:0}: Error finding container 3c33b8efe55079f317440bfa0bf803aec266c61031c43e9cb8c7968c8abf610f: Status 404 returned error can't find the container with id 3c33b8efe55079f317440bfa0bf803aec266c61031c43e9cb8c7968c8abf610f Apr 22 20:03:22.344778 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:22.344739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-b6cj8" event={"ID":"e539b585-55da-4149-a6a5-acf759c55565","Type":"ContainerStarted","Data":"8e78f652937b294440d30d79f4516387dcaa663798f07e5c9ab15a29b4b46823"} Apr 22 20:03:22.344778 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:22.344780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-b6cj8" event={"ID":"e539b585-55da-4149-a6a5-acf759c55565","Type":"ContainerStarted","Data":"3c33b8efe55079f317440bfa0bf803aec266c61031c43e9cb8c7968c8abf610f"} Apr 22 20:03:22.359793 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:22.359742 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-b6cj8" podStartSLOduration=1.359727786 podStartE2EDuration="1.359727786s" podCreationTimestamp="2026-04-22 20:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:03:22.358680891 +0000 UTC 
m=+337.673773122" watchObservedRunningTime="2026-04-22 20:03:22.359727786 +0000 UTC m=+337.674820018" Apr 22 20:03:25.355981 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:25.355949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" event={"ID":"9fbe25ee-cbc0-42fd-9592-6015d395cd1f","Type":"ContainerStarted","Data":"aa396b11b7e0cba78c4f9e9e1f8379e749bcdd0f83859f02dca244fde52b793f"} Apr 22 20:03:25.371868 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:25.371804 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zws8j" podStartSLOduration=1.395446693 podStartE2EDuration="5.371791999s" podCreationTimestamp="2026-04-22 20:03:20 +0000 UTC" firstStartedPulling="2026-04-22 20:03:20.788141821 +0000 UTC m=+336.103234031" lastFinishedPulling="2026-04-22 20:03:24.764487123 +0000 UTC m=+340.079579337" observedRunningTime="2026-04-22 20:03:25.370102514 +0000 UTC m=+340.685194746" watchObservedRunningTime="2026-04-22 20:03:25.371791999 +0000 UTC m=+340.686884231" Apr 22 20:03:54.229213 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.229177 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5846f88986-g272f"] Apr 22 20:03:54.235111 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.235087 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.241271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.241247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 20:03:54.242136 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.242113 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 20:03:54.244937 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.244915 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 20:03:54.245211 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.245050 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-f4nfg\"" Apr 22 20:03:54.246507 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.246489 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5846f88986-g272f"] Apr 22 20:03:54.324821 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.324784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-cert\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.325096 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.324829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-metrics-cert\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " 
pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.325096 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.325006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-manager-config\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.325096 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.325061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n48fx\" (UniqueName: \"kubernetes.io/projected/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-kube-api-access-n48fx\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.426384 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.426344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-manager-config\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.426564 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.426407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n48fx\" (UniqueName: \"kubernetes.io/projected/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-kube-api-access-n48fx\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.426564 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.426444 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-cert\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.426564 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.426465 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-metrics-cert\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.427191 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.427166 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-manager-config\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.428931 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.428912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-cert\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.429093 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.429073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-metrics-cert\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " 
pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.437647 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.437621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n48fx\" (UniqueName: \"kubernetes.io/projected/e6f57a3c-e388-4c16-aab1-ab242ce9d1bb-kube-api-access-n48fx\") pod \"lws-controller-manager-5846f88986-g272f\" (UID: \"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.544753 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.544716 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:54.670246 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:54.670214 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5846f88986-g272f"] Apr 22 20:03:54.672002 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:03:54.671971 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6f57a3c_e388_4c16_aab1_ab242ce9d1bb.slice/crio-0acf6608f4d14433db1094163a4019d6cb30f014992e882a930af0442dccd4ac WatchSource:0}: Error finding container 0acf6608f4d14433db1094163a4019d6cb30f014992e882a930af0442dccd4ac: Status 404 returned error can't find the container with id 0acf6608f4d14433db1094163a4019d6cb30f014992e882a930af0442dccd4ac Apr 22 20:03:55.440905 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:55.440863 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" event={"ID":"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb","Type":"ContainerStarted","Data":"0acf6608f4d14433db1094163a4019d6cb30f014992e882a930af0442dccd4ac"} Apr 22 20:03:57.447480 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:57.447433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" event={"ID":"e6f57a3c-e388-4c16-aab1-ab242ce9d1bb","Type":"ContainerStarted","Data":"24fc084b567a83a66080fa3f3fddb13fd58093d9acd89e59c43469e62b98e13f"} Apr 22 20:03:57.447921 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:57.447670 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:03:57.467294 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:03:57.467241 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" podStartSLOduration=1.391408402 podStartE2EDuration="3.467222412s" podCreationTimestamp="2026-04-22 20:03:54 +0000 UTC" firstStartedPulling="2026-04-22 20:03:54.673932929 +0000 UTC m=+369.989025141" lastFinishedPulling="2026-04-22 20:03:56.749746937 +0000 UTC m=+372.064839151" observedRunningTime="2026-04-22 20:03:57.465026559 +0000 UTC m=+372.780118839" watchObservedRunningTime="2026-04-22 20:03:57.467222412 +0000 UTC m=+372.782314648" Apr 22 20:04:08.452681 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:08.452646 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5846f88986-g272f" Apr 22 20:04:32.226715 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.226685 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-7bw52"] Apr 22 20:04:32.229110 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.229093 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" Apr 22 20:04:32.231624 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.231601 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 20:04:32.231736 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.231629 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 20:04:32.232120 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.232100 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-hc8vs\"" Apr 22 20:04:32.239784 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.239764 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-7bw52"] Apr 22 20:04:32.336383 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.336349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccjmc\" (UniqueName: \"kubernetes.io/projected/6971e1a7-75bb-4cf5-9121-501f2d314714-kube-api-access-ccjmc\") pod \"authorino-operator-7587b89b76-7bw52\" (UID: \"6971e1a7-75bb-4cf5-9121-501f2d314714\") " pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" Apr 22 20:04:32.437576 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.437538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccjmc\" (UniqueName: \"kubernetes.io/projected/6971e1a7-75bb-4cf5-9121-501f2d314714-kube-api-access-ccjmc\") pod \"authorino-operator-7587b89b76-7bw52\" (UID: \"6971e1a7-75bb-4cf5-9121-501f2d314714\") " pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" Apr 22 20:04:32.447328 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.447307 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ccjmc\" (UniqueName: \"kubernetes.io/projected/6971e1a7-75bb-4cf5-9121-501f2d314714-kube-api-access-ccjmc\") pod \"authorino-operator-7587b89b76-7bw52\" (UID: \"6971e1a7-75bb-4cf5-9121-501f2d314714\") " pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" Apr 22 20:04:32.540473 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.540431 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" Apr 22 20:04:32.657695 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:32.657663 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-7bw52"] Apr 22 20:04:32.661706 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:04:32.661679 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6971e1a7_75bb_4cf5_9121_501f2d314714.slice/crio-fbe10ca06e4d9a4767b4341c1ab304eab24d651c59da6e0b3c7c77a57d1c4221 WatchSource:0}: Error finding container fbe10ca06e4d9a4767b4341c1ab304eab24d651c59da6e0b3c7c77a57d1c4221: Status 404 returned error can't find the container with id fbe10ca06e4d9a4767b4341c1ab304eab24d651c59da6e0b3c7c77a57d1c4221 Apr 22 20:04:33.562006 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:33.561974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" event={"ID":"6971e1a7-75bb-4cf5-9121-501f2d314714","Type":"ContainerStarted","Data":"fbe10ca06e4d9a4767b4341c1ab304eab24d651c59da6e0b3c7c77a57d1c4221"} Apr 22 20:04:36.572350 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:36.572313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" event={"ID":"6971e1a7-75bb-4cf5-9121-501f2d314714","Type":"ContainerStarted","Data":"94fdf3245e8047449579ea4662f0ad0d909abd46574e2c8d95909a8242b61fc2"} Apr 22 20:04:36.572727 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:04:36.572443 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" Apr 22 20:04:36.590515 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:36.590465 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" podStartSLOduration=1.370520753 podStartE2EDuration="4.590451246s" podCreationTimestamp="2026-04-22 20:04:32 +0000 UTC" firstStartedPulling="2026-04-22 20:04:32.663556613 +0000 UTC m=+407.978648823" lastFinishedPulling="2026-04-22 20:04:35.883487105 +0000 UTC m=+411.198579316" observedRunningTime="2026-04-22 20:04:36.588417419 +0000 UTC m=+411.903509652" watchObservedRunningTime="2026-04-22 20:04:36.590451246 +0000 UTC m=+411.905543480" Apr 22 20:04:47.577869 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:04:47.577816 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-7bw52" Apr 22 20:05:23.479288 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.479251 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-x98p9"] Apr 22 20:05:23.483729 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.483708 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:23.485858 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.485811 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 20:05:23.485972 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.485819 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ghvlf\"" Apr 22 20:05:23.488596 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.488574 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-x98p9"] Apr 22 20:05:23.515630 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.515586 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-x98p9"] Apr 22 20:05:23.560527 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.560489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e0e8f835-af26-4eab-89bc-aaba452e1d80-config-file\") pod \"limitador-limitador-67566c68b4-x98p9\" (UID: \"e0e8f835-af26-4eab-89bc-aaba452e1d80\") " pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:23.560527 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.560528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8nq\" (UniqueName: \"kubernetes.io/projected/e0e8f835-af26-4eab-89bc-aaba452e1d80-kube-api-access-fc8nq\") pod \"limitador-limitador-67566c68b4-x98p9\" (UID: \"e0e8f835-af26-4eab-89bc-aaba452e1d80\") " pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:23.661744 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.661704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" 
(UniqueName: \"kubernetes.io/configmap/e0e8f835-af26-4eab-89bc-aaba452e1d80-config-file\") pod \"limitador-limitador-67566c68b4-x98p9\" (UID: \"e0e8f835-af26-4eab-89bc-aaba452e1d80\") " pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:23.661744 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.661747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8nq\" (UniqueName: \"kubernetes.io/projected/e0e8f835-af26-4eab-89bc-aaba452e1d80-kube-api-access-fc8nq\") pod \"limitador-limitador-67566c68b4-x98p9\" (UID: \"e0e8f835-af26-4eab-89bc-aaba452e1d80\") " pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:23.662359 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.662332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e0e8f835-af26-4eab-89bc-aaba452e1d80-config-file\") pod \"limitador-limitador-67566c68b4-x98p9\" (UID: \"e0e8f835-af26-4eab-89bc-aaba452e1d80\") " pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:23.669710 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.669674 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8nq\" (UniqueName: \"kubernetes.io/projected/e0e8f835-af26-4eab-89bc-aaba452e1d80-kube-api-access-fc8nq\") pod \"limitador-limitador-67566c68b4-x98p9\" (UID: \"e0e8f835-af26-4eab-89bc-aaba452e1d80\") " pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:23.796679 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.796646 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:23.922825 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:23.922792 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-x98p9"] Apr 22 20:05:23.926470 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:05:23.926433 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e8f835_af26_4eab_89bc_aaba452e1d80.slice/crio-ba788763bbe5258228bc1ba20971c7ec3fe7cef0509c312b1b271b3147e4da79 WatchSource:0}: Error finding container ba788763bbe5258228bc1ba20971c7ec3fe7cef0509c312b1b271b3147e4da79: Status 404 returned error can't find the container with id ba788763bbe5258228bc1ba20971c7ec3fe7cef0509c312b1b271b3147e4da79 Apr 22 20:05:24.722201 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:24.722151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" event={"ID":"e0e8f835-af26-4eab-89bc-aaba452e1d80","Type":"ContainerStarted","Data":"ba788763bbe5258228bc1ba20971c7ec3fe7cef0509c312b1b271b3147e4da79"} Apr 22 20:05:27.737806 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:27.737767 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" event={"ID":"e0e8f835-af26-4eab-89bc-aaba452e1d80","Type":"ContainerStarted","Data":"760a2ebe1aebf3e3660719e17309f49419bd111ad5811f771a862d0647d895e3"} Apr 22 20:05:27.738285 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:27.737899 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:05:27.754080 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:27.754015 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" podStartSLOduration=1.122245006 
podStartE2EDuration="4.75399249s" podCreationTimestamp="2026-04-22 20:05:23 +0000 UTC" firstStartedPulling="2026-04-22 20:05:23.928705525 +0000 UTC m=+459.243797735" lastFinishedPulling="2026-04-22 20:05:27.560452991 +0000 UTC m=+462.875545219" observedRunningTime="2026-04-22 20:05:27.752111796 +0000 UTC m=+463.067204040" watchObservedRunningTime="2026-04-22 20:05:27.75399249 +0000 UTC m=+463.069084722" Apr 22 20:05:38.741589 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:05:38.741553 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-x98p9" Apr 22 20:08:34.396811 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.396778 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd"] Apr 22 20:08:34.399135 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.399119 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.401556 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.401530 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:08:34.401556 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.401556 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:08:34.402374 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.402356 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:08:34.402477 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.402404 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 22 
20:08:34.409206 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.409185 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd"] Apr 22 20:08:34.438700 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.438675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.438823 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.438706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.438823 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.438725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs2k5\" (UniqueName: \"kubernetes.io/projected/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kube-api-access-cs2k5\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.438823 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.438785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.438823 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.438802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.439002 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.438829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.539862 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.539801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.540049 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.539874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.540049 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.539927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.540049 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.539974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.540049 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.540010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cs2k5\" (UniqueName: \"kubernetes.io/projected/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kube-api-access-cs2k5\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.540264 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.540055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.540321 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.540274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.540363 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.540334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.540437 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.540419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.542207 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.542187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.542434 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.542418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.547522 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.547495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs2k5\" (UniqueName: \"kubernetes.io/projected/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kube-api-access-cs2k5\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.709591 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.709505 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:08:34.829971 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.829947 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd"] Apr 22 20:08:34.832258 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:08:34.832233 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5eef0e7_98de_4a8d_94b5_467e0feef2bd.slice/crio-501af5c8f1213fe8eefc22d5a7ed47c52f30ddef5cc30dbd8739241ed3f6732e WatchSource:0}: Error finding container 501af5c8f1213fe8eefc22d5a7ed47c52f30ddef5cc30dbd8739241ed3f6732e: Status 404 returned error can't find the container with id 501af5c8f1213fe8eefc22d5a7ed47c52f30ddef5cc30dbd8739241ed3f6732e Apr 22 20:08:34.834003 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:34.833983 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:08:35.338434 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:35.338387 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" event={"ID":"e5eef0e7-98de-4a8d-94b5-467e0feef2bd","Type":"ContainerStarted","Data":"501af5c8f1213fe8eefc22d5a7ed47c52f30ddef5cc30dbd8739241ed3f6732e"} Apr 22 20:08:38.352182 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:08:38.352144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" event={"ID":"e5eef0e7-98de-4a8d-94b5-467e0feef2bd","Type":"ContainerStarted","Data":"fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4"} Apr 22 20:09:04.435696 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:04.435609 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" containerID="fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4" exitCode=0 Apr 22 20:09:04.436078 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:04.435685 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" event={"ID":"e5eef0e7-98de-4a8d-94b5-467e0feef2bd","Type":"ContainerDied","Data":"fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4"} Apr 22 20:09:06.445187 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:06.445155 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" event={"ID":"e5eef0e7-98de-4a8d-94b5-467e0feef2bd","Type":"ContainerStarted","Data":"f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb"} Apr 22 20:09:06.462448 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:06.462399 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" podStartSLOduration=1.8566563249999999 podStartE2EDuration="32.462385409s" podCreationTimestamp="2026-04-22 20:08:34 +0000 UTC" firstStartedPulling="2026-04-22 20:08:34.834173215 +0000 UTC m=+650.149265429" lastFinishedPulling="2026-04-22 20:09:05.439902303 +0000 UTC m=+680.754994513" observedRunningTime="2026-04-22 20:09:06.461136371 +0000 UTC m=+681.776228603" watchObservedRunningTime="2026-04-22 20:09:06.462385409 +0000 UTC m=+681.777477641" Apr 22 20:09:11.266910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.266873 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df"] Apr 22 20:09:11.289027 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.288997 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df"] Apr 22 20:09:11.289200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.289115 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.291274 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.291249 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-25ljw\"" Apr 22 20:09:11.291393 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.291288 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 20:09:11.357224 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.357184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7gvf\" (UniqueName: \"kubernetes.io/projected/c9b8c404-033c-4669-b391-6d1393c91a25-kube-api-access-g7gvf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.357224 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.357227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.357405 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.357282 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.357405 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.357314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b8c404-033c-4669-b391-6d1393c91a25-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.357405 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.357364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.357405 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.357403 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458366 ip-10-0-143-253 kubenswrapper[2577]: I0422 
20:09:11.458324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458383 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b8c404-033c-4669-b391-6d1393c91a25-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458485 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7gvf\" (UniqueName: \"kubernetes.io/projected/c9b8c404-033c-4669-b391-6d1393c91a25-kube-api-access-g7gvf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458550 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458504 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458810 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458885 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.458930 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458899 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.459012 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.458987 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.461330 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.461310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b8c404-033c-4669-b391-6d1393c91a25-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.468867 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.468825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7gvf\" (UniqueName: \"kubernetes.io/projected/c9b8c404-033c-4669-b391-6d1393c91a25-kube-api-access-g7gvf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.599142 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.599099 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:11.729442 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:11.719676 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df"] Apr 22 20:09:11.731368 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:09:11.731332 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b8c404_033c_4669_b391_6d1393c91a25.slice/crio-8a8840ec2e4afe321f8c475c36bcfadcbecad57c33c44483b80ae6026913b75f WatchSource:0}: Error finding container 8a8840ec2e4afe321f8c475c36bcfadcbecad57c33c44483b80ae6026913b75f: Status 404 returned error can't find the container with id 8a8840ec2e4afe321f8c475c36bcfadcbecad57c33c44483b80ae6026913b75f Apr 22 20:09:12.466018 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:12.465976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" event={"ID":"c9b8c404-033c-4669-b391-6d1393c91a25","Type":"ContainerStarted","Data":"28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992"} Apr 22 20:09:12.466018 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:12.466019 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" event={"ID":"c9b8c404-033c-4669-b391-6d1393c91a25","Type":"ContainerStarted","Data":"8a8840ec2e4afe321f8c475c36bcfadcbecad57c33c44483b80ae6026913b75f"} Apr 22 20:09:13.470328 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:13.470293 2577 generic.go:358] "Generic (PLEG): container finished" podID="c9b8c404-033c-4669-b391-6d1393c91a25" containerID="28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992" exitCode=0 Apr 22 20:09:13.470696 ip-10-0-143-253 kubenswrapper[2577]: I0422 
20:09:13.470381 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" event={"ID":"c9b8c404-033c-4669-b391-6d1393c91a25","Type":"ContainerDied","Data":"28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992"} Apr 22 20:09:14.709973 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:14.709936 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:09:14.709973 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:14.709978 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:09:14.722298 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:14.722276 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:09:15.478474 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:15.478437 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" event={"ID":"c9b8c404-033c-4669-b391-6d1393c91a25","Type":"ContainerStarted","Data":"c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a"} Apr 22 20:09:15.490032 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:15.489992 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:09:37.004335 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:37.004257 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd"] Apr 22 20:09:37.004824 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:37.004587 2577 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" podUID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" containerName="main" containerID="cri-o://f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb" gracePeriod=30 Apr 22 20:09:44.486003 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.485982 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:09:44.570305 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.570222 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-tls-certs\") pod \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " Apr 22 20:09:44.570305 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.570260 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kserve-provision-location\") pod \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " Apr 22 20:09:44.570305 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.570293 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-dshm\") pod \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " Apr 22 20:09:44.570583 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.570321 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs2k5\" (UniqueName: \"kubernetes.io/projected/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kube-api-access-cs2k5\") pod 
\"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " Apr 22 20:09:44.570583 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.570351 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-model-cache\") pod \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " Apr 22 20:09:44.570583 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.570390 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-home\") pod \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\" (UID: \"e5eef0e7-98de-4a8d-94b5-467e0feef2bd\") " Apr 22 20:09:44.570745 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.570685 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-model-cache" (OuterVolumeSpecName: "model-cache") pod "e5eef0e7-98de-4a8d-94b5-467e0feef2bd" (UID: "e5eef0e7-98de-4a8d-94b5-467e0feef2bd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:44.570862 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.570826 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-home" (OuterVolumeSpecName: "home") pod "e5eef0e7-98de-4a8d-94b5-467e0feef2bd" (UID: "e5eef0e7-98de-4a8d-94b5-467e0feef2bd"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:44.573366 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.572715 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-dshm" (OuterVolumeSpecName: "dshm") pod "e5eef0e7-98de-4a8d-94b5-467e0feef2bd" (UID: "e5eef0e7-98de-4a8d-94b5-467e0feef2bd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:44.573366 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.572821 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kube-api-access-cs2k5" (OuterVolumeSpecName: "kube-api-access-cs2k5") pod "e5eef0e7-98de-4a8d-94b5-467e0feef2bd" (UID: "e5eef0e7-98de-4a8d-94b5-467e0feef2bd"). InnerVolumeSpecName "kube-api-access-cs2k5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:09:44.573366 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.572927 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e5eef0e7-98de-4a8d-94b5-467e0feef2bd" (UID: "e5eef0e7-98de-4a8d-94b5-467e0feef2bd"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:09:44.590155 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.590128 2577 generic.go:358] "Generic (PLEG): container finished" podID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" containerID="f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb" exitCode=0 Apr 22 20:09:44.590339 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.590205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" event={"ID":"e5eef0e7-98de-4a8d-94b5-467e0feef2bd","Type":"ContainerDied","Data":"f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb"} Apr 22 20:09:44.590339 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.590235 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" event={"ID":"e5eef0e7-98de-4a8d-94b5-467e0feef2bd","Type":"ContainerDied","Data":"501af5c8f1213fe8eefc22d5a7ed47c52f30ddef5cc30dbd8739241ed3f6732e"} Apr 22 20:09:44.590339 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.590254 2577 scope.go:117] "RemoveContainer" containerID="f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb" Apr 22 20:09:44.590339 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.590268 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd" Apr 22 20:09:44.625420 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.625382 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e5eef0e7-98de-4a8d-94b5-467e0feef2bd" (UID: "e5eef0e7-98de-4a8d-94b5-467e0feef2bd"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:44.671294 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.671266 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-tls-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:09:44.671294 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.671292 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kserve-provision-location\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:09:44.671481 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.671306 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-dshm\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:09:44.671481 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.671321 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cs2k5\" (UniqueName: \"kubernetes.io/projected/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-kube-api-access-cs2k5\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:09:44.671481 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.671335 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-model-cache\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:09:44.671481 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.671346 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5eef0e7-98de-4a8d-94b5-467e0feef2bd-home\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:09:44.683981 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.683960 2577 
scope.go:117] "RemoveContainer" containerID="fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4" Apr 22 20:09:44.860913 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.860879 2577 scope.go:117] "RemoveContainer" containerID="f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb" Apr 22 20:09:44.861244 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:09:44.861212 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb\": container with ID starting with f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb not found: ID does not exist" containerID="f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb" Apr 22 20:09:44.861340 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.861257 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb"} err="failed to get container status \"f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb\": rpc error: code = NotFound desc = could not find container \"f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb\": container with ID starting with f8a2f5d50a72ef1647860e582afe99ca22650e1292f53cc0e3ff9ce14ad4a1fb not found: ID does not exist" Apr 22 20:09:44.861340 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.861291 2577 scope.go:117] "RemoveContainer" containerID="fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4" Apr 22 20:09:44.861593 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:09:44.861572 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4\": container with ID starting with fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4 not found: ID does 
not exist" containerID="fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4" Apr 22 20:09:44.861634 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.861602 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4"} err="failed to get container status \"fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4\": rpc error: code = NotFound desc = could not find container \"fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4\": container with ID starting with fce4c3ccf538453502c49049f7a30c0c75628a341c7992c889408f24ad6ae1d4 not found: ID does not exist" Apr 22 20:09:44.911705 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.911679 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd"] Apr 22 20:09:44.913927 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:44.913905 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-798f86c66f445pd"] Apr 22 20:09:45.253976 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:45.253943 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" path="/var/lib/kubelet/pods/e5eef0e7-98de-4a8d-94b5-467e0feef2bd/volumes" Apr 22 20:09:45.596539 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:45.596508 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" event={"ID":"c9b8c404-033c-4669-b391-6d1393c91a25","Type":"ContainerStarted","Data":"1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86"} Apr 22 20:09:45.596976 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:45.596752 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:45.598984 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:45.598959 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 20:09:45.616084 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:45.616026 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" podStartSLOduration=3.1326252549999998 podStartE2EDuration="34.61600747s" podCreationTimestamp="2026-04-22 20:09:11 +0000 UTC" firstStartedPulling="2026-04-22 20:09:13.47154351 +0000 UTC m=+688.786635720" lastFinishedPulling="2026-04-22 20:09:44.954925708 +0000 UTC m=+720.270017935" observedRunningTime="2026-04-22 20:09:45.613642191 +0000 UTC m=+720.928734422" watchObservedRunningTime="2026-04-22 20:09:45.61600747 +0000 UTC m=+720.931099703" Apr 22 20:09:46.607005 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:46.606968 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 20:09:51.599401 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:51.599364 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:51.599816 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:51.599413 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" Apr 22 20:09:51.599816 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:51.599704 2577 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.26:8082/healthz\": dial tcp 10.133.0.26:8082: connect: connection refused" Apr 22 20:09:51.600973 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:51.600934 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 20:09:57.785469 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:57.785439 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df"] Apr 22 20:09:57.785984 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:57.785737 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="main" containerID="cri-o://c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a" gracePeriod=30 Apr 22 20:09:57.785984 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:57.785786 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="tokenizer" containerID="cri-o://1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86" gracePeriod=30 Apr 22 20:09:57.787029 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:09:57.787003 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 20:09:58.642992 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:58.642957 2577 generic.go:358] "Generic (PLEG): container finished" podID="c9b8c404-033c-4669-b391-6d1393c91a25" containerID="c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a" exitCode=0 Apr 22 20:09:58.643150 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:58.643028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" event={"ID":"c9b8c404-033c-4669-b391-6d1393c91a25","Type":"ContainerDied","Data":"c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a"} Apr 22 20:09:59.043772 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.043748 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df"
Apr 22 20:09:59.089160 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089095 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-cache\") pod \"c9b8c404-033c-4669-b391-6d1393c91a25\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") "
Apr 22 20:09:59.089160 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089128 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-kserve-provision-location\") pod \"c9b8c404-033c-4669-b391-6d1393c91a25\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") "
Apr 22 20:09:59.089160 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089153 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7gvf\" (UniqueName: \"kubernetes.io/projected/c9b8c404-033c-4669-b391-6d1393c91a25-kube-api-access-g7gvf\") pod \"c9b8c404-033c-4669-b391-6d1393c91a25\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") "
Apr 22 20:09:59.089402 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089191 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-uds\") pod \"c9b8c404-033c-4669-b391-6d1393c91a25\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") "
Apr 22 20:09:59.089402 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089214 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b8c404-033c-4669-b391-6d1393c91a25-tls-certs\") pod \"c9b8c404-033c-4669-b391-6d1393c91a25\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") "
Apr 22 20:09:59.089402 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089278 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-tmp\") pod \"c9b8c404-033c-4669-b391-6d1393c91a25\" (UID: \"c9b8c404-033c-4669-b391-6d1393c91a25\") "
Apr 22 20:09:59.089555 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089428 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c9b8c404-033c-4669-b391-6d1393c91a25" (UID: "c9b8c404-033c-4669-b391-6d1393c91a25"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:59.089555 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089453 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c9b8c404-033c-4669-b391-6d1393c91a25" (UID: "c9b8c404-033c-4669-b391-6d1393c91a25"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:59.089555 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089528 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-cache\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:09:59.089555 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089546 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-uds\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:09:59.089731 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089658 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c9b8c404-033c-4669-b391-6d1393c91a25" (UID: "c9b8c404-033c-4669-b391-6d1393c91a25"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:59.089936 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.089910 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c9b8c404-033c-4669-b391-6d1393c91a25" (UID: "c9b8c404-033c-4669-b391-6d1393c91a25"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:59.091257 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.091237 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b8c404-033c-4669-b391-6d1393c91a25-kube-api-access-g7gvf" (OuterVolumeSpecName: "kube-api-access-g7gvf") pod "c9b8c404-033c-4669-b391-6d1393c91a25" (UID: "c9b8c404-033c-4669-b391-6d1393c91a25"). InnerVolumeSpecName "kube-api-access-g7gvf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:09:59.091314 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.091255 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b8c404-033c-4669-b391-6d1393c91a25-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c9b8c404-033c-4669-b391-6d1393c91a25" (UID: "c9b8c404-033c-4669-b391-6d1393c91a25"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:09:59.190578 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.190549 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-kserve-provision-location\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:09:59.190578 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.190576 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g7gvf\" (UniqueName: \"kubernetes.io/projected/c9b8c404-033c-4669-b391-6d1393c91a25-kube-api-access-g7gvf\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:09:59.190578 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.190587 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b8c404-033c-4669-b391-6d1393c91a25-tls-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:09:59.190785 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.190596 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c9b8c404-033c-4669-b391-6d1393c91a25-tokenizer-tmp\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:09:59.648172 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.648139 2577 generic.go:358] "Generic (PLEG): container finished" podID="c9b8c404-033c-4669-b391-6d1393c91a25" containerID="1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86" exitCode=0
Apr 22 20:09:59.648339 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.648217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" event={"ID":"c9b8c404-033c-4669-b391-6d1393c91a25","Type":"ContainerDied","Data":"1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86"}
Apr 22 20:09:59.648339 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.648261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df" event={"ID":"c9b8c404-033c-4669-b391-6d1393c91a25","Type":"ContainerDied","Data":"8a8840ec2e4afe321f8c475c36bcfadcbecad57c33c44483b80ae6026913b75f"}
Apr 22 20:09:59.648339 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.648276 2577 scope.go:117] "RemoveContainer" containerID="1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86"
Apr 22 20:09:59.648339 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.648232 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df"
Apr 22 20:09:59.656269 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.656239 2577 scope.go:117] "RemoveContainer" containerID="c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a"
Apr 22 20:09:59.663319 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.663293 2577 scope.go:117] "RemoveContainer" containerID="28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992"
Apr 22 20:09:59.666171 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.666147 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df"]
Apr 22 20:09:59.670049 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.670024 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756hz4df"]
Apr 22 20:09:59.671024 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.671010 2577 scope.go:117] "RemoveContainer" containerID="1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86"
Apr 22 20:09:59.671256 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:09:59.671238 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86\": container with ID starting with 1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86 not found: ID does not exist" containerID="1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86"
Apr 22 20:09:59.671302 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.671264 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86"} err="failed to get container status \"1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86\": rpc error: code = NotFound desc = could not find container \"1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86\": container with ID starting with 1fbc8774aa3f8d41ca11d3a1c9df6b2bfedf484937c02a3e45468436dfb83a86 not found: ID does not exist"
Apr 22 20:09:59.671302 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.671281 2577 scope.go:117] "RemoveContainer" containerID="c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a"
Apr 22 20:09:59.671489 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:09:59.671475 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a\": container with ID starting with c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a not found: ID does not exist" containerID="c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a"
Apr 22 20:09:59.671526 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.671493 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a"} err="failed to get container status \"c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a\": rpc error: code = NotFound desc = could not find container \"c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a\": container with ID starting with c4b5a977b719bf35af8a08fe152054ea9193e7c998cfc7392ba46de95425781a not found: ID does not exist"
Apr 22 20:09:59.671526 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.671506 2577 scope.go:117] "RemoveContainer" containerID="28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992"
Apr 22 20:09:59.671697 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:09:59.671683 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992\": container with ID starting with 28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992 not found: ID does not exist" containerID="28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992"
Apr 22 20:09:59.671739 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:09:59.671701 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992"} err="failed to get container status \"28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992\": rpc error: code = NotFound desc = could not find container \"28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992\": container with ID starting with 28f64ca52305f82a8974d38ec3806f19b72b702a5ea7a818c34a276ad76a8992 not found: ID does not exist"
Apr 22 20:10:01.252411 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:01.252376 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" path="/var/lib/kubelet/pods/c9b8c404-033c-4669-b391-6d1393c91a25/volumes"
Apr 22 20:10:08.510779 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.510745 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"]
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511125 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="main"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511137 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="main"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511149 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" containerName="storage-initializer"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511156 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" containerName="storage-initializer"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511167 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" containerName="main"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511173 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" containerName="main"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511185 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="tokenizer"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511191 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="tokenizer"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511203 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="storage-initializer"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511210 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="storage-initializer"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511261 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="main"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511270 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5eef0e7-98de-4a8d-94b5-467e0feef2bd" containerName="main"
Apr 22 20:10:08.511271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.511278 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9b8c404-033c-4669-b391-6d1393c91a25" containerName="tokenizer"
Apr 22 20:10:08.523624 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.523589 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"]
Apr 22 20:10:08.523773 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.523678 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.525978 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.525954 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 20:10:08.526848 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.526814 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 20:10:08.526946 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.526865 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\""
Apr 22 20:10:08.526946 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.526893 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 22 20:10:08.573531 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.573503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-dshm\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.573682 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.573544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.573682 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.573574 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-model-cache\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.573682 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.573628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c26a5363-3d10-48b9-b18d-675de5ef7789-tls-certs\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.573808 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.573687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-home\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.573808 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.573720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9gkt\" (UniqueName: \"kubernetes.io/projected/c26a5363-3d10-48b9-b18d-675de5ef7789-kube-api-access-g9gkt\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.674144 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-home\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.674144 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9gkt\" (UniqueName: \"kubernetes.io/projected/c26a5363-3d10-48b9-b18d-675de5ef7789-kube-api-access-g9gkt\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.674380 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-dshm\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.674380 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.674380 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-model-cache\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.674380 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c26a5363-3d10-48b9-b18d-675de5ef7789-tls-certs\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.674588 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-home\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.674809 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-model-cache\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.675069 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.674885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.676594 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.676575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-dshm\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.677035 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.677017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c26a5363-3d10-48b9-b18d-675de5ef7789-tls-certs\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.682288 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.682269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9gkt\" (UniqueName: \"kubernetes.io/projected/c26a5363-3d10-48b9-b18d-675de5ef7789-kube-api-access-g9gkt\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-5vjbr\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.834493 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.834458 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:08.954730 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:08.954698 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"]
Apr 22 20:10:08.957237 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:10:08.957213 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc26a5363_3d10_48b9_b18d_675de5ef7789.slice/crio-e47be345db3fed2e5628de1b23ad88697793b42fb0b4bdb523a0aef0fdbd6b74 WatchSource:0}: Error finding container e47be345db3fed2e5628de1b23ad88697793b42fb0b4bdb523a0aef0fdbd6b74: Status 404 returned error can't find the container with id e47be345db3fed2e5628de1b23ad88697793b42fb0b4bdb523a0aef0fdbd6b74
Apr 22 20:10:09.681854 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:09.681797 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr" event={"ID":"c26a5363-3d10-48b9-b18d-675de5ef7789","Type":"ContainerStarted","Data":"32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a"}
Apr 22 20:10:09.682238 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:09.681863 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr" event={"ID":"c26a5363-3d10-48b9-b18d-675de5ef7789","Type":"ContainerStarted","Data":"e47be345db3fed2e5628de1b23ad88697793b42fb0b4bdb523a0aef0fdbd6b74"}
Apr 22 20:10:15.703650 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:15.703613 2577 generic.go:358] "Generic (PLEG): container finished" podID="c26a5363-3d10-48b9-b18d-675de5ef7789" containerID="32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a" exitCode=0
Apr 22 20:10:15.704079 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:15.703674 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr" event={"ID":"c26a5363-3d10-48b9-b18d-675de5ef7789","Type":"ContainerDied","Data":"32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a"}
Apr 22 20:10:16.708067 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:16.708033 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr" event={"ID":"c26a5363-3d10-48b9-b18d-675de5ef7789","Type":"ContainerStarted","Data":"6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5"}
Apr 22 20:10:16.725943 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:16.725889 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr" podStartSLOduration=8.725868453 podStartE2EDuration="8.725868453s" podCreationTimestamp="2026-04-22 20:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:10:16.724446558 +0000 UTC m=+752.039538791" watchObservedRunningTime="2026-04-22 20:10:16.725868453 +0000 UTC m=+752.040960683"
Apr 22 20:10:18.835104 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:18.835061 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:18.835104 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:18.835102 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:18.847584 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:18.847559 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:19.728996 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:19.728969 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:42.429770 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.429667 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"]
Apr 22 20:10:42.430356 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.430064 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr" podUID="c26a5363-3d10-48b9-b18d-675de5ef7789" containerName="main" containerID="cri-o://6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5" gracePeriod=30
Apr 22 20:10:42.673322 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.673300 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:42.768045 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.767954 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-dshm\") pod \"c26a5363-3d10-48b9-b18d-675de5ef7789\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") "
Apr 22 20:10:42.768045 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.768006 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9gkt\" (UniqueName: \"kubernetes.io/projected/c26a5363-3d10-48b9-b18d-675de5ef7789-kube-api-access-g9gkt\") pod \"c26a5363-3d10-48b9-b18d-675de5ef7789\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") "
Apr 22 20:10:42.768045 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.768021 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-home\") pod \"c26a5363-3d10-48b9-b18d-675de5ef7789\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") "
Apr 22 20:10:42.768327 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.768073 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c26a5363-3d10-48b9-b18d-675de5ef7789-tls-certs\") pod \"c26a5363-3d10-48b9-b18d-675de5ef7789\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") "
Apr 22 20:10:42.768327 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.768120 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-kserve-provision-location\") pod \"c26a5363-3d10-48b9-b18d-675de5ef7789\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") "
Apr 22 20:10:42.768327 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.768139 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-model-cache\") pod \"c26a5363-3d10-48b9-b18d-675de5ef7789\" (UID: \"c26a5363-3d10-48b9-b18d-675de5ef7789\") "
Apr 22 20:10:42.768327 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.768228 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-home" (OuterVolumeSpecName: "home") pod "c26a5363-3d10-48b9-b18d-675de5ef7789" (UID: "c26a5363-3d10-48b9-b18d-675de5ef7789"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:10:42.768525 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.768347 2577 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-home\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:10:42.768525 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.768444 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-model-cache" (OuterVolumeSpecName: "model-cache") pod "c26a5363-3d10-48b9-b18d-675de5ef7789" (UID: "c26a5363-3d10-48b9-b18d-675de5ef7789"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:10:42.770145 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.770125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-dshm" (OuterVolumeSpecName: "dshm") pod "c26a5363-3d10-48b9-b18d-675de5ef7789" (UID: "c26a5363-3d10-48b9-b18d-675de5ef7789"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:10:42.770249 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.770231 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c26a5363-3d10-48b9-b18d-675de5ef7789-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c26a5363-3d10-48b9-b18d-675de5ef7789" (UID: "c26a5363-3d10-48b9-b18d-675de5ef7789"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:10:42.770453 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.770439 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c26a5363-3d10-48b9-b18d-675de5ef7789-kube-api-access-g9gkt" (OuterVolumeSpecName: "kube-api-access-g9gkt") pod "c26a5363-3d10-48b9-b18d-675de5ef7789" (UID: "c26a5363-3d10-48b9-b18d-675de5ef7789"). InnerVolumeSpecName "kube-api-access-g9gkt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:10:42.790073 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.790048 2577 generic.go:358] "Generic (PLEG): container finished" podID="c26a5363-3d10-48b9-b18d-675de5ef7789" containerID="6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5" exitCode=0
Apr 22 20:10:42.790192 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.790081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr" event={"ID":"c26a5363-3d10-48b9-b18d-675de5ef7789","Type":"ContainerDied","Data":"6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5"}
Apr 22 20:10:42.790192 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.790104 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr" event={"ID":"c26a5363-3d10-48b9-b18d-675de5ef7789","Type":"ContainerDied","Data":"e47be345db3fed2e5628de1b23ad88697793b42fb0b4bdb523a0aef0fdbd6b74"}
Apr 22 20:10:42.790192 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.790124 2577 scope.go:117] "RemoveContainer" containerID="6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5"
Apr 22 20:10:42.790192 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.790132 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"
Apr 22 20:10:42.797651 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.797632 2577 scope.go:117] "RemoveContainer" containerID="32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a"
Apr 22 20:10:42.827387 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.827340 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c26a5363-3d10-48b9-b18d-675de5ef7789" (UID: "c26a5363-3d10-48b9-b18d-675de5ef7789"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:10:42.859888 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.859865 2577 scope.go:117] "RemoveContainer" containerID="6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5"
Apr 22 20:10:42.860178 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:10:42.860156 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5\": container with ID starting with 6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5 not found: ID does not exist" containerID="6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5"
Apr 22 20:10:42.860230 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.860189 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5"} err="failed to get container status \"6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5\": rpc error: code = NotFound desc = could not find container \"6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5\": container with ID starting with
6d31a78f6a218babc926ea4fa96924263526480f806f580b21458c342b54e7a5 not found: ID does not exist" Apr 22 20:10:42.860230 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.860208 2577 scope.go:117] "RemoveContainer" containerID="32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a" Apr 22 20:10:42.860471 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:10:42.860452 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a\": container with ID starting with 32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a not found: ID does not exist" containerID="32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a" Apr 22 20:10:42.860523 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.860476 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a"} err="failed to get container status \"32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a\": rpc error: code = NotFound desc = could not find container \"32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a\": container with ID starting with 32b5370f719e44587d010601527f9779267d65411cc0b4f8e5e99ba9d80b193a not found: ID does not exist" Apr 22 20:10:42.868898 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.868875 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-kserve-provision-location\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.868898 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.868898 2577 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-model-cache\") on node 
\"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.869050 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.868908 2577 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c26a5363-3d10-48b9-b18d-675de5ef7789-dshm\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.869050 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.868918 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9gkt\" (UniqueName: \"kubernetes.io/projected/c26a5363-3d10-48b9-b18d-675de5ef7789-kube-api-access-g9gkt\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.869050 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:42.868927 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c26a5363-3d10-48b9-b18d-675de5ef7789-tls-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:10:43.111170 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:43.111143 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"] Apr 22 20:10:43.115351 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:43.115329 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-5vjbr"] Apr 22 20:10:43.252516 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:10:43.252486 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c26a5363-3d10-48b9-b18d-675de5ef7789" path="/var/lib/kubelet/pods/c26a5363-3d10-48b9-b18d-675de5ef7789/volumes" Apr 22 20:11:07.643957 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.643869 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql"] Apr 22 20:11:07.644369 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.644200 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="c26a5363-3d10-48b9-b18d-675de5ef7789" containerName="main" Apr 22 20:11:07.644369 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.644212 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c26a5363-3d10-48b9-b18d-675de5ef7789" containerName="main" Apr 22 20:11:07.644369 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.644230 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c26a5363-3d10-48b9-b18d-675de5ef7789" containerName="storage-initializer" Apr 22 20:11:07.644369 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.644239 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c26a5363-3d10-48b9-b18d-675de5ef7789" containerName="storage-initializer" Apr 22 20:11:07.644369 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.644299 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c26a5363-3d10-48b9-b18d-675de5ef7789" containerName="main" Apr 22 20:11:07.649374 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.649353 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.651659 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.651634 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:11:07.651786 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.651657 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:11:07.651886 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.651821 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 20:11:07.652338 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.652319 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-j5m5z\"" Apr 22 20:11:07.652459 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.652362 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:11:07.661855 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.659609 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql"] Apr 22 20:11:07.758200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.758172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.758378 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.758215 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.758378 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.758239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.758378 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.758333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.758543 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.758393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.758543 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.758461 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmgpb\" (UniqueName: \"kubernetes.io/projected/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kube-api-access-kmgpb\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859268 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmgpb\" (UniqueName: \"kubernetes.io/projected/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kube-api-access-kmgpb\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859457 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859457 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859457 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859344 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859604 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859604 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859739 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859723 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859806 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859903 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.859903 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.859881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.862115 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.862094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.866413 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.866396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmgpb\" (UniqueName: \"kubernetes.io/projected/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kube-api-access-kmgpb\") pod 
\"stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:07.965335 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:07.965224 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:08.087819 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:08.087784 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql"] Apr 22 20:11:08.090906 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:11:08.090880 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc90ff58b_5721_4fb5_b68b_cb94c7ac5e0f.slice/crio-c86f378444380bb3498f65d3e8b7f193e1e1dbb8beb9a58cf67f3c8a82920f66 WatchSource:0}: Error finding container c86f378444380bb3498f65d3e8b7f193e1e1dbb8beb9a58cf67f3c8a82920f66: Status 404 returned error can't find the container with id c86f378444380bb3498f65d3e8b7f193e1e1dbb8beb9a58cf67f3c8a82920f66 Apr 22 20:11:08.884169 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:08.884130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" event={"ID":"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f","Type":"ContainerStarted","Data":"4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286"} Apr 22 20:11:08.884169 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:08.884168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" event={"ID":"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f","Type":"ContainerStarted","Data":"c86f378444380bb3498f65d3e8b7f193e1e1dbb8beb9a58cf67f3c8a82920f66"} Apr 22 20:11:09.888413 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:11:09.888380 2577 generic.go:358] "Generic (PLEG): container finished" podID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerID="4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286" exitCode=0 Apr 22 20:11:09.888812 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:09.888437 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" event={"ID":"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f","Type":"ContainerDied","Data":"4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286"} Apr 22 20:11:10.894092 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:10.894054 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" event={"ID":"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f","Type":"ContainerStarted","Data":"7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a"} Apr 22 20:11:10.894542 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:10.894097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" event={"ID":"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f","Type":"ContainerStarted","Data":"08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472"} Apr 22 20:11:10.894542 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:10.894193 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:10.917013 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:10.916968 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" podStartSLOduration=3.916945594 podStartE2EDuration="3.916945594s" podCreationTimestamp="2026-04-22 20:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:11:10.915353659 +0000 UTC m=+806.230445890" watchObservedRunningTime="2026-04-22 20:11:10.916945594 +0000 UTC m=+806.232037827" Apr 22 20:11:17.965944 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:17.965900 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:17.965944 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:17.965951 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:17.968414 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:17.968392 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:18.923433 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:18.923401 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:11:39.927267 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:11:39.927239 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:12:59.062462 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:12:59.062431 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql"] Apr 22 20:12:59.242484 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:12:59.242413 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="main" 
containerID="cri-o://08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472" gracePeriod=30 Apr 22 20:12:59.242484 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:12:59.242438 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="tokenizer" containerID="cri-o://7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a" gracePeriod=30 Apr 22 20:12:59.927172 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:12:59.927139 2577 logging.go:55] [core] [Channel #89 SubChannel #90]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.28:9003", ServerName: "10.133.0.28:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.28:9003: connect: connection refused" Apr 22 20:13:00.248474 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.248442 2577 generic.go:358] "Generic (PLEG): container finished" podID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerID="08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472" exitCode=0 Apr 22 20:13:00.248778 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.248505 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" event={"ID":"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f","Type":"ContainerDied","Data":"08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472"} Apr 22 20:13:00.385154 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.385133 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:13:00.513344 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513254 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmgpb\" (UniqueName: \"kubernetes.io/projected/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kube-api-access-kmgpb\") pod \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " Apr 22 20:13:00.513344 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513335 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-tmp\") pod \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " Apr 22 20:13:00.513570 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513363 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tls-certs\") pod \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " Apr 22 20:13:00.513570 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513396 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-cache\") pod \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " Apr 22 20:13:00.513570 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513437 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kserve-provision-location\") pod \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " Apr 
22 20:13:00.513570 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513493 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-uds\") pod \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\" (UID: \"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f\") " Apr 22 20:13:00.513750 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513646 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" (UID: "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:13:00.513798 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513726 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" (UID: "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:13:00.513798 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.513775 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" (UID: "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:13:00.514314 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.514287 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" (UID: "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:13:00.515338 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.515315 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kube-api-access-kmgpb" (OuterVolumeSpecName: "kube-api-access-kmgpb") pod "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" (UID: "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f"). InnerVolumeSpecName "kube-api-access-kmgpb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:13:00.515434 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.515416 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" (UID: "c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:13:00.614822 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.614783 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmgpb\" (UniqueName: \"kubernetes.io/projected/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kube-api-access-kmgpb\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:13:00.614822 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.614814 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-tmp\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:13:00.614822 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.614825 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tls-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:13:00.615081 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.614861 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-cache\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:13:00.615081 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.614871 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-kserve-provision-location\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:13:00.615081 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:00.614880 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f-tokenizer-uds\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:13:00.926451 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:13:00.926412 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.28:9003\" within 1s: context deadline exceeded" Apr 22 20:13:01.252964 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.252883 2577 generic.go:358] "Generic (PLEG): container finished" podID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerID="7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a" exitCode=0 Apr 22 20:13:01.252964 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.252952 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" Apr 22 20:13:01.253407 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.252952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" event={"ID":"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f","Type":"ContainerDied","Data":"7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a"} Apr 22 20:13:01.253407 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.253049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql" event={"ID":"c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f","Type":"ContainerDied","Data":"c86f378444380bb3498f65d3e8b7f193e1e1dbb8beb9a58cf67f3c8a82920f66"} Apr 22 20:13:01.253407 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.253066 2577 scope.go:117] "RemoveContainer" containerID="7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a" Apr 22 20:13:01.261444 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.261319 2577 scope.go:117] "RemoveContainer" 
containerID="08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472" Apr 22 20:13:01.268790 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.268771 2577 scope.go:117] "RemoveContainer" containerID="4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286" Apr 22 20:13:01.275290 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.275270 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql"] Apr 22 20:13:01.275695 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.275679 2577 scope.go:117] "RemoveContainer" containerID="7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a" Apr 22 20:13:01.276015 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:13:01.275991 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a\": container with ID starting with 7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a not found: ID does not exist" containerID="7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a" Apr 22 20:13:01.276080 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.276022 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a"} err="failed to get container status \"7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a\": rpc error: code = NotFound desc = could not find container \"7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a\": container with ID starting with 7643ea63a56ce192804006c525b233a9160f26ec71c5857a5c8d22e27c3e5c0a not found: ID does not exist" Apr 22 20:13:01.276080 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.276037 2577 scope.go:117] "RemoveContainer" containerID="08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472" Apr 22 
20:13:01.276315 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:13:01.276279 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472\": container with ID starting with 08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472 not found: ID does not exist" containerID="08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472" Apr 22 20:13:01.276390 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.276321 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472"} err="failed to get container status \"08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472\": rpc error: code = NotFound desc = could not find container \"08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472\": container with ID starting with 08088ccb98df16993649bc732e600b9aff4d50c1204e9774ef72e83b432c4472 not found: ID does not exist" Apr 22 20:13:01.276390 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.276337 2577 scope.go:117] "RemoveContainer" containerID="4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286" Apr 22 20:13:01.276558 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:13:01.276541 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286\": container with ID starting with 4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286 not found: ID does not exist" containerID="4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286" Apr 22 20:13:01.276608 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.276567 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286"} err="failed to get container status \"4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286\": rpc error: code = NotFound desc = could not find container \"4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286\": container with ID starting with 4d8247fbe10b5548cc42a32b3233cc0a93778e35312427f073b116455124d286 not found: ID does not exist" Apr 22 20:13:01.279249 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:01.279224 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-4nqql"] Apr 22 20:13:03.252645 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:03.252613 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" path="/var/lib/kubelet/pods/c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f/volumes" Apr 22 20:13:27.615503 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.615469 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47"] Apr 22 20:13:27.616124 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.616073 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="main" Apr 22 20:13:27.616124 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.616101 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="main" Apr 22 20:13:27.616124 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.616122 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="storage-initializer" Apr 22 20:13:27.616124 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.616132 2577 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="storage-initializer" Apr 22 20:13:27.616383 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.616165 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="tokenizer" Apr 22 20:13:27.616383 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.616174 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="tokenizer" Apr 22 20:13:27.616383 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.616262 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="main" Apr 22 20:13:27.616383 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.616277 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c90ff58b-5721-4fb5-b68b-cb94c7ac5e0f" containerName="tokenizer" Apr 22 20:13:27.619200 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.619185 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.621657 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.621636 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:13:27.621778 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.621684 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-cjmtt\"" Apr 22 20:13:27.622361 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.622342 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 20:13:27.622458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.622364 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:13:27.622458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.622372 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:13:27.629738 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.629718 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47"] Apr 22 20:13:27.641106 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.641080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/903f834f-26c9-4ba6-9a21-4331d7505b51-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.641268 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.641136 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.641268 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.641223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.641388 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.641285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.641388 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.641369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.641494 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.641399 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9452r\" (UniqueName: \"kubernetes.io/projected/903f834f-26c9-4ba6-9a21-4331d7505b51-kube-api-access-9452r\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.742462 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.742429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/903f834f-26c9-4ba6-9a21-4331d7505b51-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.742462 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.742472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.742739 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.742497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.742739 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.742527 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.742739 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.742566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.742739 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.742582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9452r\" (UniqueName: \"kubernetes.io/projected/903f834f-26c9-4ba6-9a21-4331d7505b51-kube-api-access-9452r\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.742976 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.742948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.743025 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.742974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.743060 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.743024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.743095 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.743072 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.745018 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.744992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/903f834f-26c9-4ba6-9a21-4331d7505b51-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.750198 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.750173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9452r\" (UniqueName: \"kubernetes.io/projected/903f834f-26c9-4ba6-9a21-4331d7505b51-kube-api-access-9452r\") pod 
\"stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:27.929347 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:27.929256 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:28.053620 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:28.053590 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47"] Apr 22 20:13:28.055368 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:13:28.055331 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod903f834f_26c9_4ba6_9a21_4331d7505b51.slice/crio-8ab1101809f9247a331ef4be80364c3f8d3ceb65902d9ed6e6397054a2d6fd9d WatchSource:0}: Error finding container 8ab1101809f9247a331ef4be80364c3f8d3ceb65902d9ed6e6397054a2d6fd9d: Status 404 returned error can't find the container with id 8ab1101809f9247a331ef4be80364c3f8d3ceb65902d9ed6e6397054a2d6fd9d Apr 22 20:13:28.340029 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:28.339983 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" event={"ID":"903f834f-26c9-4ba6-9a21-4331d7505b51","Type":"ContainerStarted","Data":"5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3"} Apr 22 20:13:28.340029 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:28.340030 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" event={"ID":"903f834f-26c9-4ba6-9a21-4331d7505b51","Type":"ContainerStarted","Data":"8ab1101809f9247a331ef4be80364c3f8d3ceb65902d9ed6e6397054a2d6fd9d"} Apr 22 20:13:29.343893 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:13:29.343856 2577 generic.go:358] "Generic (PLEG): container finished" podID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerID="5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3" exitCode=0 Apr 22 20:13:29.344355 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:29.343918 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" event={"ID":"903f834f-26c9-4ba6-9a21-4331d7505b51","Type":"ContainerDied","Data":"5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3"} Apr 22 20:13:30.350039 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:30.349998 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" event={"ID":"903f834f-26c9-4ba6-9a21-4331d7505b51","Type":"ContainerStarted","Data":"5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d"} Apr 22 20:13:30.350039 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:30.350036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" event={"ID":"903f834f-26c9-4ba6-9a21-4331d7505b51","Type":"ContainerStarted","Data":"a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773"} Apr 22 20:13:30.350567 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:30.350126 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:30.370620 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:30.370578 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" podStartSLOduration=3.370561383 podStartE2EDuration="3.370561383s" podCreationTimestamp="2026-04-22 20:13:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:13:30.368357483 +0000 UTC m=+945.683449716" watchObservedRunningTime="2026-04-22 20:13:30.370561383 +0000 UTC m=+945.685653614" Apr 22 20:13:37.929430 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:37.929396 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:37.929430 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:37.929431 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:37.932296 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:37.932271 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:38.381363 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:38.381338 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:13:59.384491 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:13:59.384464 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:15:08.588391 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:08.588356 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47"] Apr 22 20:15:08.589031 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:08.588747 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="main" 
containerID="cri-o://a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773" gracePeriod=30 Apr 22 20:15:08.589031 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:08.588786 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="tokenizer" containerID="cri-o://5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d" gracePeriod=30 Apr 22 20:15:09.383994 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:15:09.383957 2577 logging.go:55] [core] [Channel #142 SubChannel #143]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.29:9003", ServerName: "10.133.0.29:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.29:9003: connect: connection refused" Apr 22 20:15:09.670154 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.670123 2577 generic.go:358] "Generic (PLEG): container finished" podID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerID="a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773" exitCode=0 Apr 22 20:15:09.670484 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.670178 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" event={"ID":"903f834f-26c9-4ba6-9a21-4331d7505b51","Type":"ContainerDied","Data":"a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773"} Apr 22 20:15:09.826757 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.826734 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:15:09.992570 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992475 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-uds\") pod \"903f834f-26c9-4ba6-9a21-4331d7505b51\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " Apr 22 20:15:09.992570 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992537 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-tmp\") pod \"903f834f-26c9-4ba6-9a21-4331d7505b51\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " Apr 22 20:15:09.992792 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992596 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-kserve-provision-location\") pod \"903f834f-26c9-4ba6-9a21-4331d7505b51\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " Apr 22 20:15:09.992792 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992658 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9452r\" (UniqueName: \"kubernetes.io/projected/903f834f-26c9-4ba6-9a21-4331d7505b51-kube-api-access-9452r\") pod \"903f834f-26c9-4ba6-9a21-4331d7505b51\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " Apr 22 20:15:09.992792 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992686 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-cache\") pod \"903f834f-26c9-4ba6-9a21-4331d7505b51\" (UID: 
\"903f834f-26c9-4ba6-9a21-4331d7505b51\") " Apr 22 20:15:09.992792 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992727 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/903f834f-26c9-4ba6-9a21-4331d7505b51-tls-certs\") pod \"903f834f-26c9-4ba6-9a21-4331d7505b51\" (UID: \"903f834f-26c9-4ba6-9a21-4331d7505b51\") " Apr 22 20:15:09.993034 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992800 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "903f834f-26c9-4ba6-9a21-4331d7505b51" (UID: "903f834f-26c9-4ba6-9a21-4331d7505b51"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:15:09.993034 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992951 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "903f834f-26c9-4ba6-9a21-4331d7505b51" (UID: "903f834f-26c9-4ba6-9a21-4331d7505b51"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:15:09.993034 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.992973 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "903f834f-26c9-4ba6-9a21-4331d7505b51" (UID: "903f834f-26c9-4ba6-9a21-4331d7505b51"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:15:09.993177 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.993064 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-cache\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:15:09.993177 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.993084 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-uds\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:15:09.993177 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.993099 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-tokenizer-tmp\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:15:09.993329 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.993311 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "903f834f-26c9-4ba6-9a21-4331d7505b51" (UID: "903f834f-26c9-4ba6-9a21-4331d7505b51"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:15:09.994774 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.994749 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/903f834f-26c9-4ba6-9a21-4331d7505b51-kube-api-access-9452r" (OuterVolumeSpecName: "kube-api-access-9452r") pod "903f834f-26c9-4ba6-9a21-4331d7505b51" (UID: "903f834f-26c9-4ba6-9a21-4331d7505b51"). InnerVolumeSpecName "kube-api-access-9452r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:15:09.994875 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:09.994766 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/903f834f-26c9-4ba6-9a21-4331d7505b51-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "903f834f-26c9-4ba6-9a21-4331d7505b51" (UID: "903f834f-26c9-4ba6-9a21-4331d7505b51"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:15:10.093787 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.093752 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/903f834f-26c9-4ba6-9a21-4331d7505b51-kserve-provision-location\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:15:10.093787 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.093780 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9452r\" (UniqueName: \"kubernetes.io/projected/903f834f-26c9-4ba6-9a21-4331d7505b51-kube-api-access-9452r\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:15:10.093787 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.093791 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/903f834f-26c9-4ba6-9a21-4331d7505b51-tls-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:15:10.384066 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.384022 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.29:9003\" within 1s: context deadline exceeded" Apr 22 20:15:10.674831 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.674736 2577 generic.go:358] "Generic (PLEG): container 
finished" podID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerID="5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d" exitCode=0 Apr 22 20:15:10.674831 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.674820 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" Apr 22 20:15:10.675329 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.674851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" event={"ID":"903f834f-26c9-4ba6-9a21-4331d7505b51","Type":"ContainerDied","Data":"5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d"} Apr 22 20:15:10.675329 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.674897 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47" event={"ID":"903f834f-26c9-4ba6-9a21-4331d7505b51","Type":"ContainerDied","Data":"8ab1101809f9247a331ef4be80364c3f8d3ceb65902d9ed6e6397054a2d6fd9d"} Apr 22 20:15:10.675329 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.674921 2577 scope.go:117] "RemoveContainer" containerID="5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d" Apr 22 20:15:10.683475 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.683452 2577 scope.go:117] "RemoveContainer" containerID="a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773" Apr 22 20:15:10.690407 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.690388 2577 scope.go:117] "RemoveContainer" containerID="5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3" Apr 22 20:15:10.695974 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.695951 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47"] Apr 22 20:15:10.699097 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:15:10.699077 2577 scope.go:117] "RemoveContainer" containerID="5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d" Apr 22 20:15:10.699404 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:15:10.699383 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d\": container with ID starting with 5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d not found: ID does not exist" containerID="5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d" Apr 22 20:15:10.699481 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.699416 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d"} err="failed to get container status \"5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d\": rpc error: code = NotFound desc = could not find container \"5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d\": container with ID starting with 5d4e61873c2a66478c5938d6fdfa197558c47666b277018f95c69481a2ec064d not found: ID does not exist" Apr 22 20:15:10.699481 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.699444 2577 scope.go:117] "RemoveContainer" containerID="a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773" Apr 22 20:15:10.699578 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.699565 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-8755cc8d4-jfz47"] Apr 22 20:15:10.699738 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:15:10.699718 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773\": container with ID starting with 
a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773 not found: ID does not exist" containerID="a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773" Apr 22 20:15:10.699799 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.699741 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773"} err="failed to get container status \"a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773\": rpc error: code = NotFound desc = could not find container \"a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773\": container with ID starting with a33b4d6692f8d840993e0b132023fe940d64aab4ad70715f3d8d019603418773 not found: ID does not exist" Apr 22 20:15:10.699799 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.699757 2577 scope.go:117] "RemoveContainer" containerID="5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3" Apr 22 20:15:10.700027 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:15:10.700003 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3\": container with ID starting with 5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3 not found: ID does not exist" containerID="5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3" Apr 22 20:15:10.700110 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.700032 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3"} err="failed to get container status \"5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3\": rpc error: code = NotFound desc = could not find container \"5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3\": container with ID starting with 
5f0c5466861b8996882ddec54bd5bdf5258d76f6f4dd7ad5f7c859494a6424d3 not found: ID does not exist" Apr 22 20:15:10.875966 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.875934 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5f7fb6b5-9njpn"] Apr 22 20:15:10.876257 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.876246 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="main" Apr 22 20:15:10.876301 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.876260 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="main" Apr 22 20:15:10.876301 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.876271 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="tokenizer" Apr 22 20:15:10.876301 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.876276 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="tokenizer" Apr 22 20:15:10.876301 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.876292 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="storage-initializer" Apr 22 20:15:10.876301 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.876298 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="storage-initializer" Apr 22 20:15:10.876448 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.876350 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="tokenizer" Apr 22 20:15:10.876448 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.876360 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" containerName="main" Apr 22 20:15:10.880458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.880441 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:10.882670 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.882652 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 20:15:10.882807 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.882688 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 20:15:10.882807 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.882703 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 20:15:10.882807 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.882654 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-6dkhn\"" Apr 22 20:15:10.886364 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:10.886292 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5f7fb6b5-9njpn"] Apr 22 20:15:11.003910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.003800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxl29\" (UniqueName: \"kubernetes.io/projected/8043636a-e751-4cb3-b104-75a384995723-kube-api-access-sxl29\") pod \"llmisvc-controller-manager-5f7fb6b5-9njpn\" (UID: \"8043636a-e751-4cb3-b104-75a384995723\") " pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:11.003910 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.003880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/8043636a-e751-4cb3-b104-75a384995723-cert\") pod \"llmisvc-controller-manager-5f7fb6b5-9njpn\" (UID: \"8043636a-e751-4cb3-b104-75a384995723\") " pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:11.104880 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.104798 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxl29\" (UniqueName: \"kubernetes.io/projected/8043636a-e751-4cb3-b104-75a384995723-kube-api-access-sxl29\") pod \"llmisvc-controller-manager-5f7fb6b5-9njpn\" (UID: \"8043636a-e751-4cb3-b104-75a384995723\") " pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:11.105076 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.104919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8043636a-e751-4cb3-b104-75a384995723-cert\") pod \"llmisvc-controller-manager-5f7fb6b5-9njpn\" (UID: \"8043636a-e751-4cb3-b104-75a384995723\") " pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:11.107274 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.107254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8043636a-e751-4cb3-b104-75a384995723-cert\") pod \"llmisvc-controller-manager-5f7fb6b5-9njpn\" (UID: \"8043636a-e751-4cb3-b104-75a384995723\") " pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:11.112377 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.112352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxl29\" (UniqueName: \"kubernetes.io/projected/8043636a-e751-4cb3-b104-75a384995723-kube-api-access-sxl29\") pod \"llmisvc-controller-manager-5f7fb6b5-9njpn\" (UID: \"8043636a-e751-4cb3-b104-75a384995723\") " pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:11.191344 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.191303 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:11.253727 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.253692 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="903f834f-26c9-4ba6-9a21-4331d7505b51" path="/var/lib/kubelet/pods/903f834f-26c9-4ba6-9a21-4331d7505b51/volumes" Apr 22 20:15:11.311395 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.311375 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5f7fb6b5-9njpn"] Apr 22 20:15:11.313672 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:15:11.313643 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8043636a_e751_4cb3_b104_75a384995723.slice/crio-ad3dd5e8cea5f35d61eaa9703d7ef015f709ffe5ce6435a0afd1e077937f1b12 WatchSource:0}: Error finding container ad3dd5e8cea5f35d61eaa9703d7ef015f709ffe5ce6435a0afd1e077937f1b12: Status 404 returned error can't find the container with id ad3dd5e8cea5f35d61eaa9703d7ef015f709ffe5ce6435a0afd1e077937f1b12 Apr 22 20:15:11.314957 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.314934 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:15:11.679319 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:11.679282 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" event={"ID":"8043636a-e751-4cb3-b104-75a384995723","Type":"ContainerStarted","Data":"ad3dd5e8cea5f35d61eaa9703d7ef015f709ffe5ce6435a0afd1e077937f1b12"} Apr 22 20:15:15.694284 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:15.694244 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" event={"ID":"8043636a-e751-4cb3-b104-75a384995723","Type":"ContainerStarted","Data":"529de7b6e38d9577da69f994c81980c926574bda54ffb4ab89085b3b81e84734"} Apr 22 
20:15:15.694713 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:15.694304 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:15:15.712106 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:15.712060 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" podStartSLOduration=2.351425231 podStartE2EDuration="5.712046234s" podCreationTimestamp="2026-04-22 20:15:10 +0000 UTC" firstStartedPulling="2026-04-22 20:15:11.315061366 +0000 UTC m=+1046.630153577" lastFinishedPulling="2026-04-22 20:15:14.675682366 +0000 UTC m=+1049.990774580" observedRunningTime="2026-04-22 20:15:15.710715948 +0000 UTC m=+1051.025808181" watchObservedRunningTime="2026-04-22 20:15:15.712046234 +0000 UTC m=+1051.027138465" Apr 22 20:15:46.700819 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:15:46.700780 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5f7fb6b5-9njpn" Apr 22 20:18:26.899183 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.899151 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7"] Apr 22 20:18:26.902475 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.902448 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:26.904887 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.904863 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:18:26.905722 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.905701 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-v7pkm\"" Apr 22 20:18:26.905877 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.905763 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 20:18:26.905877 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.905815 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:18:26.906051 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.905912 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:18:26.914409 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.914374 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7"] Apr 22 20:18:26.929208 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.929183 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:26.929331 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.929218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:26.929331 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.929244 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:26.929405 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.929347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsmp5\" (UniqueName: \"kubernetes.io/projected/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kube-api-access-tsmp5\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:26.929405 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.929388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: 
\"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:26.929467 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:26.929441 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.029968 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.029936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsmp5\" (UniqueName: \"kubernetes.io/projected/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kube-api-access-tsmp5\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030162 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.029985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030162 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.030015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: 
\"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030162 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.030040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030162 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.030064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030162 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.030090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030433 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.030409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030494 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.030431 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030494 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.030467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.030599 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.030548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.032617 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.032595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.037539 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.037521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsmp5\" (UniqueName: \"kubernetes.io/projected/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kube-api-access-tsmp5\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.214828 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.214735 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:27.334747 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:27.334724 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7"] Apr 22 20:18:27.336668 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:18:27.336641 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8cdd1df_b2a9_4ad6_8b39_83799a12b901.slice/crio-0b1daab25b779187fe88d7389e0c4385bb9cfbd7776e3ae672f9073ab65b5bd9 WatchSource:0}: Error finding container 0b1daab25b779187fe88d7389e0c4385bb9cfbd7776e3ae672f9073ab65b5bd9: Status 404 returned error can't find the container with id 0b1daab25b779187fe88d7389e0c4385bb9cfbd7776e3ae672f9073ab65b5bd9 Apr 22 20:18:28.305435 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:28.305400 2577 generic.go:358] "Generic (PLEG): container finished" podID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerID="73c58bb64fe34ce8cfbb5b4fc2a677782d78562f9e955d8e39b53f45b4a5721a" exitCode=0 Apr 22 20:18:28.305854 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:28.305450 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" event={"ID":"a8cdd1df-b2a9-4ad6-8b39-83799a12b901","Type":"ContainerDied","Data":"73c58bb64fe34ce8cfbb5b4fc2a677782d78562f9e955d8e39b53f45b4a5721a"} Apr 22 20:18:28.305854 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:28.305476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" event={"ID":"a8cdd1df-b2a9-4ad6-8b39-83799a12b901","Type":"ContainerStarted","Data":"0b1daab25b779187fe88d7389e0c4385bb9cfbd7776e3ae672f9073ab65b5bd9"} Apr 22 20:18:29.311492 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:29.311449 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" event={"ID":"a8cdd1df-b2a9-4ad6-8b39-83799a12b901","Type":"ContainerStarted","Data":"336165ea50c915fd8c4126b9ba77eeceba4c89ba2e9cf79ffa4b7374161c8bf8"} Apr 22 20:18:29.311492 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:29.311498 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" event={"ID":"a8cdd1df-b2a9-4ad6-8b39-83799a12b901","Type":"ContainerStarted","Data":"e172194decbacf039154f3cd664cc883e16285b695d50ede651e62c99dfc628e"} Apr 22 20:18:29.311977 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:29.311667 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:29.331984 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:29.331936 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" podStartSLOduration=3.3319204940000002 podStartE2EDuration="3.331920494s" 
podCreationTimestamp="2026-04-22 20:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:18:29.330659844 +0000 UTC m=+1244.645752087" watchObservedRunningTime="2026-04-22 20:18:29.331920494 +0000 UTC m=+1244.647012728" Apr 22 20:18:37.215097 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:37.215013 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:37.215097 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:37.215049 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:37.217476 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:37.217455 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:37.339180 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:37.339150 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:18:58.343030 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:18:58.342996 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:20:08.945598 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:08.945499 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4"] Apr 22 20:20:08.949422 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:08.949395 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:08.951826 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:08.951797 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-25ddx\"" Apr 22 20:20:08.951826 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:08.951819 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 20:20:08.959208 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:08.959163 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4"] Apr 22 20:20:09.114545 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.114506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.114722 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.114563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h226s\" (UniqueName: \"kubernetes.io/projected/5afba752-7e7d-4425-97c0-051f6d42c32a-kube-api-access-h226s\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.114722 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.114633 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.114722 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.114675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.114722 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.114714 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5afba752-7e7d-4425-97c0-051f6d42c32a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.114895 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.114736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.215588 ip-10-0-143-253 kubenswrapper[2577]: I0422 
20:20:09.215492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h226s\" (UniqueName: \"kubernetes.io/projected/5afba752-7e7d-4425-97c0-051f6d42c32a-kube-api-access-h226s\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.215588 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.215537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.215588 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.215562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.215923 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.215704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5afba752-7e7d-4425-97c0-051f6d42c32a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.215923 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.215747 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.215923 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.215830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.216037 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.215942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.216037 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.216000 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.216198 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.216175 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.216235 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.216179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.218274 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.218250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5afba752-7e7d-4425-97c0-051f6d42c32a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.227464 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.227437 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h226s\" (UniqueName: \"kubernetes.io/projected/5afba752-7e7d-4425-97c0-051f6d42c32a-kube-api-access-h226s\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.261641 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.261605 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:09.393804 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.393767 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4"] Apr 22 20:20:09.397307 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:20:09.397274 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5afba752_7e7d_4425_97c0_051f6d42c32a.slice/crio-1f212110be7baf42c47e8ab57b2cf136b8216cc48dd3056d328212092ef549a4 WatchSource:0}: Error finding container 1f212110be7baf42c47e8ab57b2cf136b8216cc48dd3056d328212092ef549a4: Status 404 returned error can't find the container with id 1f212110be7baf42c47e8ab57b2cf136b8216cc48dd3056d328212092ef549a4 Apr 22 20:20:09.634822 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.634784 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" event={"ID":"5afba752-7e7d-4425-97c0-051f6d42c32a","Type":"ContainerStarted","Data":"cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa"} Apr 22 20:20:09.634822 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:09.634822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" event={"ID":"5afba752-7e7d-4425-97c0-051f6d42c32a","Type":"ContainerStarted","Data":"1f212110be7baf42c47e8ab57b2cf136b8216cc48dd3056d328212092ef549a4"} Apr 22 20:20:10.640088 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:10.640048 2577 generic.go:358] "Generic (PLEG): container finished" podID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerID="cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa" exitCode=0 Apr 22 20:20:10.640475 ip-10-0-143-253 kubenswrapper[2577]: I0422 
20:20:10.640137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" event={"ID":"5afba752-7e7d-4425-97c0-051f6d42c32a","Type":"ContainerDied","Data":"cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa"} Apr 22 20:20:11.647040 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:11.646999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" event={"ID":"5afba752-7e7d-4425-97c0-051f6d42c32a","Type":"ContainerStarted","Data":"3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861"} Apr 22 20:20:11.647040 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:11.647045 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" event={"ID":"5afba752-7e7d-4425-97c0-051f6d42c32a","Type":"ContainerStarted","Data":"b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22"} Apr 22 20:20:11.647467 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:11.647154 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:11.669614 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:11.669553 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" podStartSLOduration=3.669533919 podStartE2EDuration="3.669533919s" podCreationTimestamp="2026-04-22 20:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:20:11.666483216 +0000 UTC m=+1346.981575448" watchObservedRunningTime="2026-04-22 20:20:11.669533919 +0000 UTC m=+1346.984626152" Apr 22 20:20:19.262153 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:20:19.262112 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:19.262153 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:19.262165 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:19.264959 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:19.264936 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:19.676361 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:19.676334 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:20:34.429633 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:34.429591 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7"] Apr 22 20:20:34.430154 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:34.429987 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="main" containerID="cri-o://e172194decbacf039154f3cd664cc883e16285b695d50ede651e62c99dfc628e" gracePeriod=30 Apr 22 20:20:34.430154 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:34.430139 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="tokenizer" containerID="cri-o://336165ea50c915fd8c4126b9ba77eeceba4c89ba2e9cf79ffa4b7374161c8bf8" 
gracePeriod=30 Apr 22 20:20:34.732414 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:34.732330 2577 generic.go:358] "Generic (PLEG): container finished" podID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerID="e172194decbacf039154f3cd664cc883e16285b695d50ede651e62c99dfc628e" exitCode=0 Apr 22 20:20:34.732559 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:34.732405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" event={"ID":"a8cdd1df-b2a9-4ad6-8b39-83799a12b901","Type":"ContainerDied","Data":"e172194decbacf039154f3cd664cc883e16285b695d50ede651e62c99dfc628e"} Apr 22 20:20:35.738587 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.738537 2577 generic.go:358] "Generic (PLEG): container finished" podID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerID="336165ea50c915fd8c4126b9ba77eeceba4c89ba2e9cf79ffa4b7374161c8bf8" exitCode=0 Apr 22 20:20:35.738973 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.738608 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" event={"ID":"a8cdd1df-b2a9-4ad6-8b39-83799a12b901","Type":"ContainerDied","Data":"336165ea50c915fd8c4126b9ba77eeceba4c89ba2e9cf79ffa4b7374161c8bf8"} Apr 22 20:20:35.788997 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.788971 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:20:35.840135 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840096 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-uds\") pod \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " Apr 22 20:20:35.840320 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840150 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tls-certs\") pod \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " Apr 22 20:20:35.840320 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840176 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-cache\") pod \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " Apr 22 20:20:35.840320 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840197 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsmp5\" (UniqueName: \"kubernetes.io/projected/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kube-api-access-tsmp5\") pod \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " Apr 22 20:20:35.840320 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840226 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-tmp\") pod \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " Apr 22 
20:20:35.840320 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840283 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kserve-provision-location\") pod \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\" (UID: \"a8cdd1df-b2a9-4ad6-8b39-83799a12b901\") " Apr 22 20:20:35.840596 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840387 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a8cdd1df-b2a9-4ad6-8b39-83799a12b901" (UID: "a8cdd1df-b2a9-4ad6-8b39-83799a12b901"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:20:35.840596 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840491 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a8cdd1df-b2a9-4ad6-8b39-83799a12b901" (UID: "a8cdd1df-b2a9-4ad6-8b39-83799a12b901"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:20:35.840596 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840584 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-uds\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:20:35.840744 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840603 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-cache\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:20:35.840744 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.840692 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a8cdd1df-b2a9-4ad6-8b39-83799a12b901" (UID: "a8cdd1df-b2a9-4ad6-8b39-83799a12b901"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:20:35.841271 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.841245 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8cdd1df-b2a9-4ad6-8b39-83799a12b901" (UID: "a8cdd1df-b2a9-4ad6-8b39-83799a12b901"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:20:35.842406 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.842387 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kube-api-access-tsmp5" (OuterVolumeSpecName: "kube-api-access-tsmp5") pod "a8cdd1df-b2a9-4ad6-8b39-83799a12b901" (UID: "a8cdd1df-b2a9-4ad6-8b39-83799a12b901"). InnerVolumeSpecName "kube-api-access-tsmp5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:20:35.842529 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.842509 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a8cdd1df-b2a9-4ad6-8b39-83799a12b901" (UID: "a8cdd1df-b2a9-4ad6-8b39-83799a12b901"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:20:35.941466 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.941366 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kserve-provision-location\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:20:35.941466 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.941400 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tls-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:20:35.941466 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:35.941413 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tsmp5\" (UniqueName: \"kubernetes.io/projected/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-kube-api-access-tsmp5\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:20:35.941466 ip-10-0-143-253 kubenswrapper[2577]: 
I0422 20:20:35.941426 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8cdd1df-b2a9-4ad6-8b39-83799a12b901-tokenizer-tmp\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:20:36.744217 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:36.744184 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" event={"ID":"a8cdd1df-b2a9-4ad6-8b39-83799a12b901","Type":"ContainerDied","Data":"0b1daab25b779187fe88d7389e0c4385bb9cfbd7776e3ae672f9073ab65b5bd9"} Apr 22 20:20:36.744217 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:36.744226 2577 scope.go:117] "RemoveContainer" containerID="336165ea50c915fd8c4126b9ba77eeceba4c89ba2e9cf79ffa4b7374161c8bf8" Apr 22 20:20:36.744655 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:36.744234 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7" Apr 22 20:20:36.752694 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:36.752674 2577 scope.go:117] "RemoveContainer" containerID="e172194decbacf039154f3cd664cc883e16285b695d50ede651e62c99dfc628e" Apr 22 20:20:36.760770 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:36.760747 2577 scope.go:117] "RemoveContainer" containerID="73c58bb64fe34ce8cfbb5b4fc2a677782d78562f9e955d8e39b53f45b4a5721a" Apr 22 20:20:36.770282 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:36.770256 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7"] Apr 22 20:20:36.775764 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:36.775739 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schez9lx7"] Apr 22 20:20:37.254712 ip-10-0-143-253 kubenswrapper[2577]: I0422 
20:20:37.254675 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" path="/var/lib/kubelet/pods/a8cdd1df-b2a9-4ad6-8b39-83799a12b901/volumes" Apr 22 20:20:40.680122 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:20:40.680093 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:22:50.013897 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:50.013862 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4"] Apr 22 20:22:50.014414 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:50.014273 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="main" containerID="cri-o://b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22" gracePeriod=30 Apr 22 20:22:50.014414 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:50.014336 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="tokenizer" containerID="cri-o://3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861" gracePeriod=30 Apr 22 20:22:50.181692 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:50.181658 2577 generic.go:358] "Generic (PLEG): container finished" podID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerID="b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22" exitCode=0 Apr 22 20:22:50.181901 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:50.181731 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" event={"ID":"5afba752-7e7d-4425-97c0-051f6d42c32a","Type":"ContainerDied","Data":"b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22"} Apr 22 20:22:50.679973 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:22:50.679942 2577 logging.go:55] [core] [Channel #297 SubChannel #298]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.32:9003", ServerName: "10.133.0.32:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.32:9003: connect: connection refused" Apr 22 20:22:51.173319 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.173295 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:22:51.179361 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179336 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-kserve-provision-location\") pod \"5afba752-7e7d-4425-97c0-051f6d42c32a\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " Apr 22 20:22:51.179470 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179394 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h226s\" (UniqueName: \"kubernetes.io/projected/5afba752-7e7d-4425-97c0-051f6d42c32a-kube-api-access-h226s\") pod \"5afba752-7e7d-4425-97c0-051f6d42c32a\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " Apr 22 20:22:51.179470 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179422 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-tmp\") pod \"5afba752-7e7d-4425-97c0-051f6d42c32a\" (UID: 
\"5afba752-7e7d-4425-97c0-051f6d42c32a\") " Apr 22 20:22:51.179470 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179458 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5afba752-7e7d-4425-97c0-051f6d42c32a-tls-certs\") pod \"5afba752-7e7d-4425-97c0-051f6d42c32a\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " Apr 22 20:22:51.179624 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179490 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-cache\") pod \"5afba752-7e7d-4425-97c0-051f6d42c32a\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " Apr 22 20:22:51.179624 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179518 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-uds\") pod \"5afba752-7e7d-4425-97c0-051f6d42c32a\" (UID: \"5afba752-7e7d-4425-97c0-051f6d42c32a\") " Apr 22 20:22:51.179820 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179796 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5afba752-7e7d-4425-97c0-051f6d42c32a" (UID: "5afba752-7e7d-4425-97c0-051f6d42c32a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:22:51.179921 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179813 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5afba752-7e7d-4425-97c0-051f6d42c32a" (UID: "5afba752-7e7d-4425-97c0-051f6d42c32a"). 
InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:22:51.179968 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.179937 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5afba752-7e7d-4425-97c0-051f6d42c32a" (UID: "5afba752-7e7d-4425-97c0-051f6d42c32a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:22:51.180191 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.180166 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5afba752-7e7d-4425-97c0-051f6d42c32a" (UID: "5afba752-7e7d-4425-97c0-051f6d42c32a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:22:51.181586 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.181565 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5afba752-7e7d-4425-97c0-051f6d42c32a-kube-api-access-h226s" (OuterVolumeSpecName: "kube-api-access-h226s") pod "5afba752-7e7d-4425-97c0-051f6d42c32a" (UID: "5afba752-7e7d-4425-97c0-051f6d42c32a"). InnerVolumeSpecName "kube-api-access-h226s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:22:51.181668 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.181605 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5afba752-7e7d-4425-97c0-051f6d42c32a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5afba752-7e7d-4425-97c0-051f6d42c32a" (UID: "5afba752-7e7d-4425-97c0-051f6d42c32a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:22:51.192458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.191993 2577 generic.go:358] "Generic (PLEG): container finished" podID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerID="3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861" exitCode=0 Apr 22 20:22:51.192458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.192043 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" event={"ID":"5afba752-7e7d-4425-97c0-051f6d42c32a","Type":"ContainerDied","Data":"3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861"} Apr 22 20:22:51.192458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.192081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" event={"ID":"5afba752-7e7d-4425-97c0-051f6d42c32a","Type":"ContainerDied","Data":"1f212110be7baf42c47e8ab57b2cf136b8216cc48dd3056d328212092ef549a4"} Apr 22 20:22:51.192458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.192101 2577 scope.go:117] "RemoveContainer" containerID="3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861" Apr 22 20:22:51.194754 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.194715 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" Apr 22 20:22:51.204205 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.204189 2577 scope.go:117] "RemoveContainer" containerID="b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22" Apr 22 20:22:51.212309 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.212289 2577 scope.go:117] "RemoveContainer" containerID="cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa" Apr 22 20:22:51.219941 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.219919 2577 scope.go:117] "RemoveContainer" containerID="3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861" Apr 22 20:22:51.220191 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:22:51.220172 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861\": container with ID starting with 3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861 not found: ID does not exist" containerID="3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861" Apr 22 20:22:51.220242 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.220198 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861"} err="failed to get container status \"3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861\": rpc error: code = NotFound desc = could not find container \"3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861\": container with ID starting with 3184c6793eeec8ebbeaa6cdfd32b842d13224ce3661d9603ecff8faa6b8b3861 not found: ID does not exist" Apr 22 20:22:51.220242 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.220215 2577 scope.go:117] "RemoveContainer" 
containerID="b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22" Apr 22 20:22:51.220428 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.220403 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4"] Apr 22 20:22:51.220428 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:22:51.220418 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22\": container with ID starting with b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22 not found: ID does not exist" containerID="b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22" Apr 22 20:22:51.220595 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.220442 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22"} err="failed to get container status \"b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22\": rpc error: code = NotFound desc = could not find container \"b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22\": container with ID starting with b114e007abb2846d419c52437224cac07a4286fc26aa37c9b09c8d9246a14e22 not found: ID does not exist" Apr 22 20:22:51.220595 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.220460 2577 scope.go:117] "RemoveContainer" containerID="cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa" Apr 22 20:22:51.220706 ip-10-0-143-253 kubenswrapper[2577]: E0422 20:22:51.220687 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa\": container with ID starting with cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa not found: ID does not 
exist" containerID="cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa" Apr 22 20:22:51.220748 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.220711 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa"} err="failed to get container status \"cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa\": rpc error: code = NotFound desc = could not find container \"cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa\": container with ID starting with cdbf0c50be2419839db6d62de9afbe4edb43dfe69c605a9ba27471c2de9159fa not found: ID does not exist" Apr 22 20:22:51.225022 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.225000 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4"] Apr 22 20:22:51.252712 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.252685 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" path="/var/lib/kubelet/pods/5afba752-7e7d-4425-97c0-051f6d42c32a/volumes" Apr 22 20:22:51.280341 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.280319 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5afba752-7e7d-4425-97c0-051f6d42c32a-tls-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:22:51.280458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.280342 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-cache\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:22:51.280458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.280353 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-uds\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:22:51.280458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.280363 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-kserve-provision-location\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:22:51.280458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.280374 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h226s\" (UniqueName: \"kubernetes.io/projected/5afba752-7e7d-4425-97c0-051f6d42c32a-kube-api-access-h226s\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:22:51.280458 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.280383 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5afba752-7e7d-4425-97c0-051f6d42c32a-tokenizer-tmp\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\"" Apr 22 20:22:51.680675 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:22:51.680580 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6d485k9sd4" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.32:9003\" within 1s: context deadline exceeded" Apr 22 20:23:02.141894 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.141864 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm"] Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142328 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="main" Apr 22 
20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142345 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="main" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142366 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="main" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142374 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="main" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142389 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="tokenizer" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142398 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="tokenizer" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142422 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="storage-initializer" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142431 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="storage-initializer" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142443 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="tokenizer" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142452 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="tokenizer" Apr 22 20:23:02.142513 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:23:02.142464 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="storage-initializer" Apr 22 20:23:02.142513 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142472 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="storage-initializer" Apr 22 20:23:02.143225 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142552 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="main" Apr 22 20:23:02.143225 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142567 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="tokenizer" Apr 22 20:23:02.143225 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142579 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8cdd1df-b2a9-4ad6-8b39-83799a12b901" containerName="tokenizer" Apr 22 20:23:02.143225 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.142590 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5afba752-7e7d-4425-97c0-051f6d42c32a" containerName="main" Apr 22 20:23:02.147712 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.147690 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.150110 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.150088 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:23:02.150237 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.150156 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 20:23:02.150237 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.150155 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-zbvzq\"" Apr 22 20:23:02.151005 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.150985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:23:02.151132 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.151114 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:23:02.156861 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.156819 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm"] Apr 22 20:23:02.272453 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.272406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.272632 ip-10-0-143-253 
kubenswrapper[2577]: I0422 20:23:02.272480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.272632 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.272506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf660141-5a2d-4ca5-957c-a77a88e8865e-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.272632 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.272527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.272632 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.272559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qspqj\" (UniqueName: \"kubernetes.io/projected/bf660141-5a2d-4ca5-957c-a77a88e8865e-kube-api-access-qspqj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.272632 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.272603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.373805 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.373770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.373805 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.373811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.374078 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.373881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.374078 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.373904 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf660141-5a2d-4ca5-957c-a77a88e8865e-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.374078 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.373933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.374078 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.373958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qspqj\" (UniqueName: \"kubernetes.io/projected/bf660141-5a2d-4ca5-957c-a77a88e8865e-kube-api-access-qspqj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.374286 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.374257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.374352 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.374318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.374407 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.374362 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.374579 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.374556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.376419 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.376401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf660141-5a2d-4ca5-957c-a77a88e8865e-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.381211 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.381181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qspqj\" (UniqueName: \"kubernetes.io/projected/bf660141-5a2d-4ca5-957c-a77a88e8865e-kube-api-access-qspqj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.458867 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.458750 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:02.608005 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.607967 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm"] Apr 22 20:23:02.611871 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:23:02.611830 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf660141_5a2d_4ca5_957c_a77a88e8865e.slice/crio-9ef97ebeeba92fac0fa63e62948fdc3497289143f5d3fe883095f09eeb2b52af WatchSource:0}: Error finding container 9ef97ebeeba92fac0fa63e62948fdc3497289143f5d3fe883095f09eeb2b52af: Status 404 returned error can't find the container with id 9ef97ebeeba92fac0fa63e62948fdc3497289143f5d3fe883095f09eeb2b52af Apr 22 20:23:02.613781 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:02.613765 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:23:03.233800 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:03.233767 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" event={"ID":"bf660141-5a2d-4ca5-957c-a77a88e8865e","Type":"ContainerStarted","Data":"07fa4ca2ce6855d5a7d1e4fb485a63e4cc274311a1fd27c9d3bc2daa0470c110"} Apr 22 20:23:03.233800 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:03.233803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" event={"ID":"bf660141-5a2d-4ca5-957c-a77a88e8865e","Type":"ContainerStarted","Data":"9ef97ebeeba92fac0fa63e62948fdc3497289143f5d3fe883095f09eeb2b52af"} Apr 22 20:23:04.238930 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:04.238827 2577 generic.go:358] "Generic (PLEG): container finished" podID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerID="07fa4ca2ce6855d5a7d1e4fb485a63e4cc274311a1fd27c9d3bc2daa0470c110" exitCode=0 Apr 22 20:23:04.238930 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:04.238911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" event={"ID":"bf660141-5a2d-4ca5-957c-a77a88e8865e","Type":"ContainerDied","Data":"07fa4ca2ce6855d5a7d1e4fb485a63e4cc274311a1fd27c9d3bc2daa0470c110"} Apr 22 20:23:05.243640 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:05.243604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" event={"ID":"bf660141-5a2d-4ca5-957c-a77a88e8865e","Type":"ContainerStarted","Data":"bbddbdb1fd5eaaf8ec9fd5f4db6c0b8771430ffa514ed1ec5e0eee819c65cf4b"} Apr 22 20:23:05.243640 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:05.243641 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" 
event={"ID":"bf660141-5a2d-4ca5-957c-a77a88e8865e","Type":"ContainerStarted","Data":"bbee0dae2cb2ad75871a99b48d9b6ece8abb3adfa341bff6bd133f9f05a81a5c"} Apr 22 20:23:05.244117 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:05.243752 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:05.265920 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:05.265862 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" podStartSLOduration=3.265812403 podStartE2EDuration="3.265812403s" podCreationTimestamp="2026-04-22 20:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:23:05.263143672 +0000 UTC m=+1520.578235905" watchObservedRunningTime="2026-04-22 20:23:05.265812403 +0000 UTC m=+1520.580904635" Apr 22 20:23:12.459023 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:12.458985 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:12.459023 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:12.459034 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:12.461688 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:12.461663 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:13.273537 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:13.273507 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:23:34.276966 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:23:34.276939 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" Apr 22 20:25:43.588781 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:43.588749 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm"] Apr 22 20:25:43.589270 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:43.589101 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="main" containerID="cri-o://bbee0dae2cb2ad75871a99b48d9b6ece8abb3adfa341bff6bd133f9f05a81a5c" gracePeriod=30 Apr 22 20:25:43.589270 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:43.589161 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="tokenizer" containerID="cri-o://bbddbdb1fd5eaaf8ec9fd5f4db6c0b8771430ffa514ed1ec5e0eee819c65cf4b" gracePeriod=30 Apr 22 20:25:43.760199 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:43.760165 2577 generic.go:358] "Generic (PLEG): container finished" podID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerID="bbee0dae2cb2ad75871a99b48d9b6ece8abb3adfa341bff6bd133f9f05a81a5c" exitCode=0 Apr 22 20:25:43.760353 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:43.760235 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" 
event={"ID":"bf660141-5a2d-4ca5-957c-a77a88e8865e","Type":"ContainerDied","Data":"bbee0dae2cb2ad75871a99b48d9b6ece8abb3adfa341bff6bd133f9f05a81a5c"} Apr 22 20:25:44.276397 ip-10-0-143-253 kubenswrapper[2577]: W0422 20:25:44.276362 2577 logging.go:55] [core] [Channel #386 SubChannel #387]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.33:9003", ServerName: "10.133.0.33:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.33:9003: connect: connection refused" Apr 22 20:25:44.765303 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.765277 2577 generic.go:358] "Generic (PLEG): container finished" podID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerID="bbddbdb1fd5eaaf8ec9fd5f4db6c0b8771430ffa514ed1ec5e0eee819c65cf4b" exitCode=0 Apr 22 20:25:44.765618 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.765313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" event={"ID":"bf660141-5a2d-4ca5-957c-a77a88e8865e","Type":"ContainerDied","Data":"bbddbdb1fd5eaaf8ec9fd5f4db6c0b8771430ffa514ed1ec5e0eee819c65cf4b"} Apr 22 20:25:44.835755 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.835733 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm"
Apr 22 20:25:44.956253 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956167 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-kserve-provision-location\") pod \"bf660141-5a2d-4ca5-957c-a77a88e8865e\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") "
Apr 22 20:25:44.956253 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956213 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qspqj\" (UniqueName: \"kubernetes.io/projected/bf660141-5a2d-4ca5-957c-a77a88e8865e-kube-api-access-qspqj\") pod \"bf660141-5a2d-4ca5-957c-a77a88e8865e\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") "
Apr 22 20:25:44.956253 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956236 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-uds\") pod \"bf660141-5a2d-4ca5-957c-a77a88e8865e\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") "
Apr 22 20:25:44.956515 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956275 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf660141-5a2d-4ca5-957c-a77a88e8865e-tls-certs\") pod \"bf660141-5a2d-4ca5-957c-a77a88e8865e\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") "
Apr 22 20:25:44.956515 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956300 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-cache\") pod \"bf660141-5a2d-4ca5-957c-a77a88e8865e\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") "
Apr 22 20:25:44.956515 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956327 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-tmp\") pod \"bf660141-5a2d-4ca5-957c-a77a88e8865e\" (UID: \"bf660141-5a2d-4ca5-957c-a77a88e8865e\") "
Apr 22 20:25:44.956668 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956623 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "bf660141-5a2d-4ca5-957c-a77a88e8865e" (UID: "bf660141-5a2d-4ca5-957c-a77a88e8865e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:25:44.956668 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956646 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "bf660141-5a2d-4ca5-957c-a77a88e8865e" (UID: "bf660141-5a2d-4ca5-957c-a77a88e8865e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:25:44.956768 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.956751 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "bf660141-5a2d-4ca5-957c-a77a88e8865e" (UID: "bf660141-5a2d-4ca5-957c-a77a88e8865e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:25:44.957040 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.957016 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bf660141-5a2d-4ca5-957c-a77a88e8865e" (UID: "bf660141-5a2d-4ca5-957c-a77a88e8865e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:25:44.958389 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.958364 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf660141-5a2d-4ca5-957c-a77a88e8865e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bf660141-5a2d-4ca5-957c-a77a88e8865e" (UID: "bf660141-5a2d-4ca5-957c-a77a88e8865e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:25:44.958490 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:44.958421 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf660141-5a2d-4ca5-957c-a77a88e8865e-kube-api-access-qspqj" (OuterVolumeSpecName: "kube-api-access-qspqj") pod "bf660141-5a2d-4ca5-957c-a77a88e8865e" (UID: "bf660141-5a2d-4ca5-957c-a77a88e8865e"). InnerVolumeSpecName "kube-api-access-qspqj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:25:45.057170 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.057135 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qspqj\" (UniqueName: \"kubernetes.io/projected/bf660141-5a2d-4ca5-957c-a77a88e8865e-kube-api-access-qspqj\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:25:45.057170 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.057164 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-uds\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:25:45.057170 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.057174 2577 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf660141-5a2d-4ca5-957c-a77a88e8865e-tls-certs\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:25:45.057412 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.057182 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-cache\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:25:45.057412 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.057193 2577 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-tokenizer-tmp\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:25:45.057412 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.057204 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf660141-5a2d-4ca5-957c-a77a88e8865e-kserve-provision-location\") on node \"ip-10-0-143-253.ec2.internal\" DevicePath \"\""
Apr 22 20:25:45.276122 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.276085 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.33:9003\" within 1s: context deadline exceeded"
Apr 22 20:25:45.770129 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.770095 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm" event={"ID":"bf660141-5a2d-4ca5-957c-a77a88e8865e","Type":"ContainerDied","Data":"9ef97ebeeba92fac0fa63e62948fdc3497289143f5d3fe883095f09eeb2b52af"}
Apr 22 20:25:45.770528 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.770140 2577 scope.go:117] "RemoveContainer" containerID="bbddbdb1fd5eaaf8ec9fd5f4db6c0b8771430ffa514ed1ec5e0eee819c65cf4b"
Apr 22 20:25:45.770528 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.770144 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm"
Apr 22 20:25:45.780226 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.778622 2577 scope.go:117] "RemoveContainer" containerID="bbee0dae2cb2ad75871a99b48d9b6ece8abb3adfa341bff6bd133f9f05a81a5c"
Apr 22 20:25:45.787765 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.787677 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm"]
Apr 22 20:25:45.787822 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.787795 2577 scope.go:117] "RemoveContainer" containerID="07fa4ca2ce6855d5a7d1e4fb485a63e4cc274311a1fd27c9d3bc2daa0470c110"
Apr 22 20:25:45.791407 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:45.791382 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-76b559dddsfsrm"]
Apr 22 20:25:47.253108 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:25:47.253076 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" path="/var/lib/kubelet/pods/bf660141-5a2d-4ca5-957c-a77a88e8865e/volumes"
Apr 22 20:26:13.724140 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:13.724068 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-7bw52_6971e1a7-75bb-4cf5-9121-501f2d314714/manager/0.log"
Apr 22 20:26:13.811619 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:13.811590 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-x98p9_e0e8f835-af26-4eab-89bc-aaba452e1d80/limitador/0.log"
Apr 22 20:26:18.811463 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:18.811430 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x55sm_0763314b-12d3-4771-844c-120f25ae1bc3/global-pull-secret-syncer/0.log"
Apr 22 20:26:18.896461 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:18.896431 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wzwch_74ab80fc-aaa7-48f7-8670-ed1cd47ff5c8/konnectivity-agent/0.log"
Apr 22 20:26:18.961632 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:18.961606 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-253.ec2.internal_68c5e58877595fc451d476fd9e217735/haproxy/0.log"
Apr 22 20:26:23.172829 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:23.172801 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-7bw52_6971e1a7-75bb-4cf5-9121-501f2d314714/manager/0.log"
Apr 22 20:26:23.312325 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:23.312292 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-x98p9_e0e8f835-af26-4eab-89bc-aaba452e1d80/limitador/0.log"
Apr 22 20:26:24.512589 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.512513 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v94xd_252eb25d-0b26-498e-85d2-b99506d56ed4/kube-state-metrics/0.log"
Apr 22 20:26:24.533968 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.533943 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v94xd_252eb25d-0b26-498e-85d2-b99506d56ed4/kube-rbac-proxy-main/0.log"
Apr 22 20:26:24.557681 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.557660 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-v94xd_252eb25d-0b26-498e-85d2-b99506d56ed4/kube-rbac-proxy-self/0.log"
Apr 22 20:26:24.582806 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.582788 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7bcf7f978f-v75zc_061643e4-5536-4b98-a3db-7dc78b143be4/metrics-server/0.log"
Apr 22 20:26:24.613104 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.613086 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-fbwbl_3d17ac7f-8550-4939-b37f-744e02796a0a/monitoring-plugin/0.log"
Apr 22 20:26:24.794965 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.794941 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wg2wh_41445d5c-895e-4561-8e7f-4520630856ea/node-exporter/0.log"
Apr 22 20:26:24.818387 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.818368 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wg2wh_41445d5c-895e-4561-8e7f-4520630856ea/kube-rbac-proxy/0.log"
Apr 22 20:26:24.840821 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.840806 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wg2wh_41445d5c-895e-4561-8e7f-4520630856ea/init-textfile/0.log"
Apr 22 20:26:24.954779 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.954754 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5d1ab1f0-f27c-4498-8914-f5927c356290/prometheus/0.log"
Apr 22 20:26:24.971999 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.971979 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5d1ab1f0-f27c-4498-8914-f5927c356290/config-reloader/0.log"
Apr 22 20:26:24.991879 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:24.991862 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5d1ab1f0-f27c-4498-8914-f5927c356290/thanos-sidecar/0.log"
Apr 22 20:26:25.015124 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.015107 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5d1ab1f0-f27c-4498-8914-f5927c356290/kube-rbac-proxy-web/0.log"
Apr 22 20:26:25.038952 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.038936 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5d1ab1f0-f27c-4498-8914-f5927c356290/kube-rbac-proxy/0.log"
Apr 22 20:26:25.061082 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.061031 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5d1ab1f0-f27c-4498-8914-f5927c356290/kube-rbac-proxy-thanos/0.log"
Apr 22 20:26:25.083659 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.083640 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5d1ab1f0-f27c-4498-8914-f5927c356290/init-config-reloader/0.log"
Apr 22 20:26:25.112466 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.112442 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-8qm52_d43fc95f-82c0-458a-abd4-ae257a54e5b0/prometheus-operator/0.log"
Apr 22 20:26:25.142084 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.142064 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-8qm52_d43fc95f-82c0-458a-abd4-ae257a54e5b0/kube-rbac-proxy/0.log"
Apr 22 20:26:25.221083 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.221056 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68f67747f4-skfhf_c67727b0-cc9d-4536-8f61-bbb39b7943f6/telemeter-client/0.log"
Apr 22 20:26:25.247778 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.247755 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68f67747f4-skfhf_c67727b0-cc9d-4536-8f61-bbb39b7943f6/reload/0.log"
Apr 22 20:26:25.277994 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:25.277964 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68f67747f4-skfhf_c67727b0-cc9d-4536-8f61-bbb39b7943f6/kube-rbac-proxy/0.log"
Apr 22 20:26:27.770966 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.770938 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"]
Apr 22 20:26:27.771347 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.771250 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="main"
Apr 22 20:26:27.771347 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.771260 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="main"
Apr 22 20:26:27.771347 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.771275 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="storage-initializer"
Apr 22 20:26:27.771347 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.771280 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="storage-initializer"
Apr 22 20:26:27.771347 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.771287 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="tokenizer"
Apr 22 20:26:27.771347 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.771293 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="tokenizer"
Apr 22 20:26:27.771347 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.771345 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="tokenizer"
Apr 22 20:26:27.771597 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.771355 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf660141-5a2d-4ca5-957c-a77a88e8865e" containerName="main"
Apr 22 20:26:27.774487 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.774466 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.776796 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.776778 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7cxtv\"/\"openshift-service-ca.crt\""
Apr 22 20:26:27.777473 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.777446 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7cxtv\"/\"default-dockercfg-phjf8\""
Apr 22 20:26:27.777580 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.777480 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7cxtv\"/\"kube-root-ca.crt\""
Apr 22 20:26:27.780131 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.780111 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"]
Apr 22 20:26:27.809964 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.809938 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-lib-modules\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.809964 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.809966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-podres\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.810125 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.809989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-sys\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.810125 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.810016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-proc\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.810125 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.810067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4x2\" (UniqueName: \"kubernetes.io/projected/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-kube-api-access-nn4x2\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910651 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-lib-modules\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910651 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-podres\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910904 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-sys\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910904 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-proc\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910904 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn4x2\" (UniqueName: \"kubernetes.io/projected/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-kube-api-access-nn4x2\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910904 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-lib-modules\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910904 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910802 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-podres\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910904 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910816 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-sys\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.910904 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.910822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-proc\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:27.918996 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:27.918975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn4x2\" (UniqueName: \"kubernetes.io/projected/f79f7d6a-d4da-4dd5-879c-b9959ab733f7-kube-api-access-nn4x2\") pod \"perf-node-gather-daemonset-5gqs9\" (UID: \"f79f7d6a-d4da-4dd5-879c-b9959ab733f7\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:28.084759 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.084731 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:28.203018 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.202994 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"]
Apr 22 20:26:28.838754 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.838728 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ptknf_82d3388d-34c5-45f5-82dd-28252d41e89a/dns/0.log"
Apr 22 20:26:28.861362 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.861338 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ptknf_82d3388d-34c5-45f5-82dd-28252d41e89a/kube-rbac-proxy/0.log"
Apr 22 20:26:28.907959 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.907936 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-msxbb_3d105cfe-1e71-45ef-b072-4f6de04ca9c1/dns-node-resolver/0.log"
Apr 22 20:26:28.917290 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.917262 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9" event={"ID":"f79f7d6a-d4da-4dd5-879c-b9959ab733f7","Type":"ContainerStarted","Data":"f3d084fd9643939a264eecb23cdc826822c38c4d8d05c44d78650bb57a0591e3"}
Apr 22 20:26:28.917401 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.917296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9" event={"ID":"f79f7d6a-d4da-4dd5-879c-b9959ab733f7","Type":"ContainerStarted","Data":"ffb457f5fcfbfbf3ae2e5eb8c22ac4de9cc511d7967be1a5c64e2c40f0d1df16"}
Apr 22 20:26:28.917401 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.917334 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:28.932461 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:28.932420 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9" podStartSLOduration=1.9324097980000001 podStartE2EDuration="1.932409798s" podCreationTimestamp="2026-04-22 20:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:26:28.931230676 +0000 UTC m=+1724.246322908" watchObservedRunningTime="2026-04-22 20:26:28.932409798 +0000 UTC m=+1724.247502028"
Apr 22 20:26:29.437135 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:29.437105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rpfnc_8b248b8e-1022-47ab-b16f-e3e4f3ee7abb/node-ca/0.log"
Apr 22 20:26:30.834162 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:30.834121 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-svd5v_a663c8f0-aa0a-4c22-a907-7ecf606a4790/serve-healthcheck-canary/0.log"
Apr 22 20:26:31.403316 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:31.403292 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-khbvn_9bbf2aad-4420-4e3a-9215-7d09954398fa/kube-rbac-proxy/0.log"
Apr 22 20:26:31.423464 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:31.423444 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-khbvn_9bbf2aad-4420-4e3a-9215-7d09954398fa/exporter/0.log"
Apr 22 20:26:31.444287 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:31.444267 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-khbvn_9bbf2aad-4420-4e3a-9215-7d09954398fa/extractor/0.log"
Apr 22 20:26:33.955065 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:33.955037 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5846f88986-g272f_e6f57a3c-e388-4c16-aab1-ab242ce9d1bb/manager/0.log"
Apr 22 20:26:34.003914 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:34.003882 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-zws8j_9fbe25ee-cbc0-42fd-9592-6015d395cd1f/openshift-lws-operator/0.log"
Apr 22 20:26:34.573884 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:34.573854 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-5f7fb6b5-9njpn_8043636a-e751-4cb3-b104-75a384995723/manager/0.log"
Apr 22 20:26:34.929663 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:34.929598 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-5gqs9"
Apr 22 20:26:39.444306 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:39.444231 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lpfc8_5db6b33c-709b-4948-8eeb-99cd23ddba38/migrator/0.log"
Apr 22 20:26:39.466399 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:39.466366 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-lpfc8_5db6b33c-709b-4948-8eeb-99cd23ddba38/graceful-termination/0.log"
Apr 22 20:26:40.986279 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:40.986249 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4l6b_c28089f2-d625-4e69-b372-16c2a540e3a1/kube-multus-additional-cni-plugins/0.log"
Apr 22 20:26:41.019080 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.019058 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4l6b_c28089f2-d625-4e69-b372-16c2a540e3a1/egress-router-binary-copy/0.log"
Apr 22 20:26:41.040279 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.040255 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4l6b_c28089f2-d625-4e69-b372-16c2a540e3a1/cni-plugins/0.log"
Apr 22 20:26:41.060725 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.060706 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4l6b_c28089f2-d625-4e69-b372-16c2a540e3a1/bond-cni-plugin/0.log"
Apr 22 20:26:41.082056 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.082035 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4l6b_c28089f2-d625-4e69-b372-16c2a540e3a1/routeoverride-cni/0.log"
Apr 22 20:26:41.101861 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.101823 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4l6b_c28089f2-d625-4e69-b372-16c2a540e3a1/whereabouts-cni-bincopy/0.log"
Apr 22 20:26:41.123161 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.123138 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p4l6b_c28089f2-d625-4e69-b372-16c2a540e3a1/whereabouts-cni/0.log"
Apr 22 20:26:41.311490 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.311468 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pkpcm_e24f6b8a-d137-4b5b-94b4-011f680ada1d/kube-multus/0.log"
Apr 22 20:26:41.362775 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.362750 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5dv89_d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8/network-metrics-daemon/0.log"
Apr 22 20:26:41.382604 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:41.382583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5dv89_d77c3bfb-9ca8-4f41-8c75-d15f2c1ab3b8/kube-rbac-proxy/0.log"
Apr 22 20:26:42.834439 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:42.834411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrbxl_36e9b580-270c-4cbb-b3e6-78fde6f244ec/ovn-controller/0.log"
Apr 22 20:26:42.860797 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:42.860766 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrbxl_36e9b580-270c-4cbb-b3e6-78fde6f244ec/ovn-acl-logging/0.log"
Apr 22 20:26:42.878275 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:42.878243 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrbxl_36e9b580-270c-4cbb-b3e6-78fde6f244ec/kube-rbac-proxy-node/0.log"
Apr 22 20:26:42.897657 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:42.897633 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrbxl_36e9b580-270c-4cbb-b3e6-78fde6f244ec/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 20:26:42.916628 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:42.916606 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrbxl_36e9b580-270c-4cbb-b3e6-78fde6f244ec/northd/0.log"
Apr 22 20:26:42.936280 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:42.936256 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrbxl_36e9b580-270c-4cbb-b3e6-78fde6f244ec/nbdb/0.log"
Apr 22 20:26:42.955892 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:42.955875 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrbxl_36e9b580-270c-4cbb-b3e6-78fde6f244ec/sbdb/0.log"
Apr 22 20:26:43.082441 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:43.082415 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wrbxl_36e9b580-270c-4cbb-b3e6-78fde6f244ec/ovnkube-controller/0.log"
Apr 22 20:26:44.233295 ip-10-0-143-253 kubenswrapper[2577]: I0422 20:26:44.233266 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-t68sf_73129b41-d555-4a74-9f2a-640a35e9625f/network-check-target-container/0.log"