Apr 23 13:28:39.080149 ip-10-0-128-108 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 23 13:28:39.080161 ip-10-0-128-108 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 23 13:28:39.080169 ip-10-0-128-108 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 23 13:28:39.080439 ip-10-0-128-108 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 23 13:28:49.182335 ip-10-0-128-108 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 23 13:28:49.182351 ip-10-0-128-108 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot dc35640c2fc14fa2b01a8cf06d5e1411 --
Apr 23 13:31:05.732631 ip-10-0-128-108 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:31:06.203691 ip-10-0-128-108 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:06.203691 ip-10-0-128-108 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:31:06.203691 ip-10-0-128-108 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:06.203691 ip-10-0-128-108 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:31:06.203691 ip-10-0-128-108 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:06.206096 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.206003 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:31:06.211267 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211242 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:06.211267 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211259 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:06.211267 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211262 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:06.211267 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211265 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:06.211267 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211269 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:06.211267 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211272 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:06.211267 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211275 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:06.211267 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211277 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211280 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211283 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211286 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211289 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211292 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211295 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211297 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211300 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211305 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211308 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211312 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211315 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211320 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211322 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211325 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211344 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211348 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211351 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:06.211566 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211354 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211356 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211359 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211362 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211365 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211367 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211370 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211373 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211375 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211378 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211381 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211384 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211386 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211389 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211391 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211395 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211398 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211401 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211404 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211406 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:06.212015 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211409 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211411 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211414 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211417 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211419 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211422 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211425 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211428 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211430 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211433 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211435 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211438 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211440 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211443 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211445 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211448 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211450 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211453 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211455 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211458 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:06.212519 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211460 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211463 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211465 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211470 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211473 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211475 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211478 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211481 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211484 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211486 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211489 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211491 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211495 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211498 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211501 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211504 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211517 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211520 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211522 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211525 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:06.213012 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211921 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211926 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211929 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211932 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211934 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211938 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211940 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211945 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211949 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211952 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211955 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211958 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211961 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211963 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211966 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211969 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211972 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211974 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211977 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:06.213518 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211981 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211985 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211988 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211991 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211994 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.211997 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212000 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212002 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212005 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212008 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212010 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212013 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212015 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212018 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212021 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212024 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212027 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212029 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212032 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212034 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:06.213984 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212037 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212039 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212042 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212044 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212047 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212049 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212052 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212054 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212057 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212060 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212063 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212065 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212068 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212070 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212073 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212075 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212079 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212081 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212084 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212086 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:06.214542 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212089 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212092 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212094 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212096 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212099 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212101 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212104 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212106 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212109 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212111 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212114 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212116 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212119 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212121 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212123 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212126 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212128 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212131 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212134 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212136 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:06.215127 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212139 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212141 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212143 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212146 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212149 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212152 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212154 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212224 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212231 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212238 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212244 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212251 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212254 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212259 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212263 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212266 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212269 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212273 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212276 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212279 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212282 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212285 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212288 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:31:06.215662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212291 2571 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212293 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212296 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212301 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212304 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212308 2571 flags.go:64] FLAG: --config-dir=""
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212310 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212314 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212318 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212320 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212324 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212342 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212346 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212349 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212352 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212355 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212358 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212363 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212366 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212369 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212372 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212375 2571 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212378 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212382 2571 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212385 2571 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:31:06.216227 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212388 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212392 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212394 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212398 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212401 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212404 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212407 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212409 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212412 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212415 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212418 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212421 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212424 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 13:31:06.216859 ip-10-0-128-108
kubenswrapper[2571]: I0423 13:31:06.212427 2571 flags.go:64] FLAG: --feature-gates="" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212432 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212434 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212438 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212441 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212444 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212450 2571 flags.go:64] FLAG: --help="false" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212453 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212456 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212459 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 13:31:06.216859 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212462 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212466 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212469 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212472 2571 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212475 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212477 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212481 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212484 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212487 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212489 2571 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212492 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212495 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212498 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212501 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212504 2571 flags.go:64] FLAG: --lock-file="" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212506 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212509 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212515 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: 
I0423 13:31:06.212520 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212523 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212526 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212529 2571 flags.go:64] FLAG: --logging-format="text" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212532 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212536 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:31:06.217429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212539 2571 flags.go:64] FLAG: --manifest-url="" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212541 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212546 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212549 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212554 2571 flags.go:64] FLAG: --max-pods="110" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212557 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212560 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212563 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212566 2571 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212569 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212572 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212576 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212583 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212586 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212589 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212593 2571 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212596 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212602 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212606 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212609 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212612 2571 flags.go:64] FLAG: --port="10250" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212615 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212618 
2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-054e557f41ae528d0" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212621 2571 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:31:06.218003 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212624 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212627 2571 flags.go:64] FLAG: --register-node="true" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212632 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212635 2571 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212638 2571 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212641 2571 flags.go:64] FLAG: --registry-qps="5" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212644 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212647 2571 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212651 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212654 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212657 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212660 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212664 2571 flags.go:64] FLAG: --runonce="false" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212667 2571 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212670 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212673 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212676 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212679 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212682 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212685 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212688 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212691 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212694 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212697 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212700 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212703 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:31:06.218617 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212706 2571 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212708 2571 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212714 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212717 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212719 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212724 2571 flags.go:64] FLAG: --tls-min-version="" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212726 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212729 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212734 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212737 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212740 2571 flags.go:64] FLAG: --v="2" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212744 2571 flags.go:64] FLAG: --version="false" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212748 2571 flags.go:64] FLAG: --vmodule="" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212753 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.212756 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212847 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:06.219294 ip-10-0-128-108 
kubenswrapper[2571]: W0423 13:31:06.212851 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212854 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212862 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212865 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212868 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212871 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212873 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:06.219294 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212876 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212879 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212882 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212885 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212887 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212890 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS 
Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212892 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212895 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212898 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212901 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212904 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212906 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212909 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212911 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212914 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212916 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212919 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212923 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212926 2571 
feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212928 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:06.219876 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212931 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212933 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212936 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212938 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212941 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212943 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212946 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212949 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212952 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212954 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212957 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:06.220410 
ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212960 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212963 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212965 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212968 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212971 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212973 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212976 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212978 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212981 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:06.220410 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212983 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212986 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212989 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212991 2571 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212994 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212996 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.212999 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213001 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213004 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213007 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213010 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213013 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213016 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213018 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213021 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213024 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:06.220902 ip-10-0-128-108 
kubenswrapper[2571]: W0423 13:31:06.213026 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213029 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213032 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:06.220902 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213035 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213037 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213040 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213042 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213045 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213047 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213050 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213053 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213055 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213058 2571 feature_gate.go:328] 
unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213060 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213064 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213069 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213073 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213077 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213080 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213083 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213085 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:06.221397 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.213088 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.213093 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.219454 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.219470 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219529 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219535 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219538 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219542 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219544 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219547 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219550 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219553 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219556 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219559 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219561 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219564 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:06.221852 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219567 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219569 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219572 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219575 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219578 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219580 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219583 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219586 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219588 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219591 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219593 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219596 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219599 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219601 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219604 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219606 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219609 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219611 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219614 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219618 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:06.222257 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219621 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219624 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219627 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219629 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219632 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219634 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219637 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219639 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219642 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219644 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219648 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219652 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219655 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219658 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219660 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219663 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219665 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219668 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219670 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219673 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:06.222790 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219675 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219678 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219681 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219683 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219686 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219688 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219691 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219694 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219696 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219699 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219702 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219705 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219710 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219714 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219717 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219720 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219723 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219726 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219729 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:06.223273 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219732 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219735 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219737 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219740 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219743 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219745 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219748 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219750 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219752 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219755 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219758 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219760 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219763 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219765 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219768 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.219773 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:06.223763 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219868 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219873 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219876 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219879 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219882 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219885 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219888 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219891 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219894 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219897 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219900 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219903 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219905 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219908 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219910 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219913 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219916 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219918 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219920 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219924 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:06.224161 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219926 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219929 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219931 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219934 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219936 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219939 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219941 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219944 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219947 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219950 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219952 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219954 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219957 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219960 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219962 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219964 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219967 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219969 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219972 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219975 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:06.224666 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219977 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219980 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219983 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219986 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219988 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219991 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219994 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219996 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.219999 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220001 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220004 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220006 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220009 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220011 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220013 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220016 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220018 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220021 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220023 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220026 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:06.225153 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220028 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220031 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220033 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220036 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220039 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220042 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220044 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220048 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220052 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220055 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220057 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220061 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220064 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220067 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220069 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220072 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220075 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220078 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220080 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:06.225755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220083 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220085 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220088 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220090 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220093 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220095 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:06.220098 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.220103 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.220829 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.223939 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.224811 2571 server.go:1019] "Starting client certificate rotation"
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.224904 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:31:06.226260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.224946 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:31:06.249265 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.249247 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:31:06.252991 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.252974 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:31:06.270523 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.270492 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 23 13:31:06.277215 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.277200 2571 log.go:25] "Validated CRI v1 image API"
Apr 23 13:31:06.278209 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.278193 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:31:06.278651 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.278637 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 13:31:06.285469 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.285440 2571 fs.go:135] Filesystem UUIDs: map[1ccce5fd-1f0a-46ba-b829-cc1c85863755:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c760b2d0-f13c-4789-83ba-50fa8d32a00f:/dev/nvme0n1p4]
Apr 23 13:31:06.285547 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.285464 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 13:31:06.291374 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.291253 2571 manager.go:217] Machine: {Timestamp:2026-04-23 13:31:06.289443549 +0000 UTC m=+0.427722246 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100654 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2dec88227abdf89eb27a3e5d220a9d SystemUUID:ec2dec88-227a-bdf8-9eb2-7a3e5d220a9d BootID:dc35640c-2fc1-4fa2-b01a-8cf06d5e1411 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b0:c5:28:b6:ed Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b0:c5:28:b6:ed Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:ea:5b:79:a6:d3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 13:31:06.291374 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.291368 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 13:31:06.291487 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.291472 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 13:31:06.292467 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.292443 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 13:31:06.292605 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.292470 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-108.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 13:31:06.292653 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.292614 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 13:31:06.292653 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.292622 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 13:31:06.292653 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.292635 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:31:06.293524 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.293513 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 13:31:06.294372 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.294362 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 23 13:31:06.294619 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.294609 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 13:31:06.295163 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.295144 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xkxbw" Apr 23 13:31:06.297183 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.297173 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 23 13:31:06.297220 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.297192 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 13:31:06.297220 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.297204 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 13:31:06.297220 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.297213 2571 kubelet.go:397] "Adding apiserver pod source" Apr 23 13:31:06.297220 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.297221 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 23 13:31:06.298734 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.298720 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:31:06.298811 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.298738 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:31:06.301918 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.301903 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 13:31:06.303306 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.303291 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xkxbw" Apr 23 13:31:06.303363 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.303309 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 13:31:06.305041 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305019 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 13:31:06.305041 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305041 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305050 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305058 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 13:31:06.305115 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305063 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305069 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305075 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305082 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305089 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305095 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305110 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 13:31:06.305115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.305119 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 13:31:06.306122 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.306109 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 13:31:06.306122 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.306121 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 13:31:06.309775 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.309762 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 13:31:06.309828 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.309796 2571 server.go:1295] "Started kubelet" Apr 23 13:31:06.309914 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.309873 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 
Apr 23 13:31:06.310030 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.309958 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 13:31:06.310087 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.310062 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 13:31:06.310900 ip-10-0-128-108 systemd[1]: Started Kubernetes Kubelet. Apr 23 13:31:06.311576 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.311210 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 23 13:31:06.312114 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.311878 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 13:31:06.317377 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.317272 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:06.320681 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.320665 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:06.320906 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.320881 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 13:31:06.321386 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.321359 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 13:31:06.322102 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322073 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 13:31:06.322102 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322095 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 13:31:06.322275 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322262 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 13:31:06.322357 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:31:06.322347 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 23 13:31:06.322418 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322359 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 23 13:31:06.322517 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:06.322479 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-108.ec2.internal\" not found" Apr 23 13:31:06.322595 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322581 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-108.ec2.internal" not found Apr 23 13:31:06.322723 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322713 2571 factory.go:55] Registering systemd factory Apr 23 13:31:06.322760 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322739 2571 factory.go:223] Registration of the systemd container factory successfully Apr 23 13:31:06.322950 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322934 2571 factory.go:153] Registering CRI-O factory Apr 23 13:31:06.323018 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.322953 2571 factory.go:223] Registration of the crio container factory successfully Apr 23 13:31:06.323018 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.323011 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 13:31:06.323115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.323036 2571 factory.go:103] Registering Raw factory Apr 23 13:31:06.323115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.323051 2571 manager.go:1196] Started watching for new ooms in manager Apr 23 13:31:06.323823 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:06.323794 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 13:31:06.323954 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.323942 2571 manager.go:319] Starting recovery of all containers Apr 23 13:31:06.324215 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.324198 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:06.326517 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:06.326492 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-108.ec2.internal\" not found" node="ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.330243 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.330204 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 13:31:06.333876 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.333861 2571 manager.go:324] Recovery completed Apr 23 13:31:06.338037 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.338023 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-108.ec2.internal" not found Apr 23 13:31:06.338656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.338645 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:06.340402 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.340378 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-108.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:06.340477 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.340405 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-108.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:06.340477 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.340416 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-108.ec2.internal" 
event="NodeHasSufficientPID" Apr 23 13:31:06.340866 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.340852 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 13:31:06.340919 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.340866 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 13:31:06.340919 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.340885 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 23 13:31:06.343091 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.343080 2571 policy_none.go:49] "None policy: Start" Apr 23 13:31:06.343131 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.343095 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 13:31:06.343131 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.343105 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 23 13:31:06.385927 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.385907 2571 manager.go:341] "Starting Device Plugin manager" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:06.385967 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.385977 2571 server.go:85] "Starting device plugin registration server" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.386184 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.386197 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.386293 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.386392 2571 
plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.386406 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:06.387748 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 13:31:06.392795 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:06.387783 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-108.ec2.internal\" not found" Apr 23 13:31:06.398865 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.398852 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-128-108.ec2.internal" not found Apr 23 13:31:06.428509 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.428487 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 13:31:06.428584 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.428516 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 13:31:06.428584 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.428534 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 13:31:06.428584 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.428540 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 13:31:06.428584 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:06.428572 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 13:31:06.430786 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.430770 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:06.486551 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.486504 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:31:06.487431 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.487417 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-108.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:31:06.487486 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.487444 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-108.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:31:06.487486 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.487454 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-108.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:31:06.487486 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.487474 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.496431 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.496415 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.529676 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.529651 2571 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal"] Apr 23 13:31:06.532001 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.531986 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.532089 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.531990 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.548832 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.548816 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.552512 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.552498 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.563875 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.563861 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:06.565767 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.565752 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:06.624887 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.624866 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a1bfc99dc8a1b47af59ec4d15c476b0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal\" (UID: \"5a1bfc99dc8a1b47af59ec4d15c476b0\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.624975 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.624891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a1bfc99dc8a1b47af59ec4d15c476b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal\" (UID: \"5a1bfc99dc8a1b47af59ec4d15c476b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.624975 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.624914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/95cd19913bda1121c44084ba2ccca700-config\") pod \"kube-apiserver-proxy-ip-10-0-128-108.ec2.internal\" (UID: \"95cd19913bda1121c44084ba2ccca700\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.725721 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.725701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a1bfc99dc8a1b47af59ec4d15c476b0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal\" (UID: \"5a1bfc99dc8a1b47af59ec4d15c476b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.725828 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.725745 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a1bfc99dc8a1b47af59ec4d15c476b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal\" (UID: \"5a1bfc99dc8a1b47af59ec4d15c476b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.725828 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:31:06.725769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/95cd19913bda1121c44084ba2ccca700-config\") pod \"kube-apiserver-proxy-ip-10-0-128-108.ec2.internal\" (UID: \"95cd19913bda1121c44084ba2ccca700\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.725828 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.725807 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/95cd19913bda1121c44084ba2ccca700-config\") pod \"kube-apiserver-proxy-ip-10-0-128-108.ec2.internal\" (UID: \"95cd19913bda1121c44084ba2ccca700\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.725828 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.725812 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a1bfc99dc8a1b47af59ec4d15c476b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal\" (UID: \"5a1bfc99dc8a1b47af59ec4d15c476b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.725955 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.725818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5a1bfc99dc8a1b47af59ec4d15c476b0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal\" (UID: \"5a1bfc99dc8a1b47af59ec4d15c476b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.867627 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.867552 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" Apr 23 13:31:06.867751 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:06.867552 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" Apr 23 13:31:07.225019 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.224954 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 13:31:07.225729 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.225083 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:07.225729 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.225133 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:07.225729 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.225118 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 13:31:07.298383 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.298352 2571 apiserver.go:52] "Watching apiserver" Apr 23 13:31:07.305141 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.305094 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:26:06 +0000 
UTC" deadline="2027-10-17 06:45:14.468740678 +0000 UTC" Apr 23 13:31:07.305141 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.305136 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13001h14m7.163607295s" Apr 23 13:31:07.305495 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.305478 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 13:31:07.306324 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.306306 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rm6cm","openshift-multus/network-metrics-daemon-k958q","openshift-network-diagnostics/network-check-target-qqr6d","kube-system/konnectivity-agent-5kfq8","openshift-multus/multus-2g7ct","openshift-network-operator/iptables-alerter-4xm4l","openshift-ovn-kubernetes/ovnkube-node-zk8wt","kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn","openshift-cluster-node-tuning-operator/tuned-wfrll","openshift-image-registry/node-ca-2rlxj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal"] Apr 23 13:31:07.308808 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.308793 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.310997 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.310978 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:07.311083 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.311036 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:07.311531 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.311502 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 13:31:07.311652 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.311636 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 13:31:07.311717 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.311659 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-j987r\"" Apr 23 13:31:07.311717 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.311694 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 13:31:07.311793 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.311714 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 13:31:07.311793 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.311659 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 13:31:07.315428 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.315405 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:07.315519 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.315482 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5kfq8" Apr 23 13:31:07.315572 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.315479 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:07.317615 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.317599 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 13:31:07.317806 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.317792 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 13:31:07.317860 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.317847 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k28fs\"" Apr 23 13:31:07.319731 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.319718 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.319819 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.319806 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4xm4l" Apr 23 13:31:07.321106 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.321090 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 13:31:07.321865 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.321828 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-s6685\"" Apr 23 13:31:07.321968 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.321881 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:31:07.322028 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.322004 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 13:31:07.322256 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.322240 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:07.322326 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.322258 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:07.322326 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.322259 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zg2qg\"" Apr 23 13:31:07.322687 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.322671 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.324755 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.324737 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 13:31:07.324834 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.324740 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 13:31:07.325685 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.325670 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.326076 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.326058 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cs55k\"" Apr 23 13:31:07.326166 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.326097 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 13:31:07.326166 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.326114 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 13:31:07.326272 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.326221 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 13:31:07.327749 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.327732 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 13:31:07.327963 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.327944 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-k8s-cni-cncf-io\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328043 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.327976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-ovn\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328043 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-log-socket\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328043 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328005 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.328196 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328041 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-cni-bin\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328196 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3276fec-f4f2-47b0-bc26-38fc5eea9ab7-konnectivity-ca\") pod \"konnectivity-agent-5kfq8\" (UID: \"c3276fec-f4f2-47b0-bc26-38fc5eea9ab7\") " pod="kube-system/konnectivity-agent-5kfq8" Apr 23 13:31:07.328196 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mkdt\" (UniqueName: \"kubernetes.io/projected/d6840304-d946-4507-bcb1-55547ba40093-kube-api-access-2mkdt\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " pod="openshift-network-operator/iptables-alerter-4xm4l" Apr 23 13:31:07.328196 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-cnibin\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.328411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.328411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-cni-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328253 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-cni-binary-copy\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328276 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-cni-multus\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-os-release\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328321 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-conf-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-daemon-config\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328389 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-multus-certs\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-etc-kubernetes\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328439 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk9t4\" (UniqueName: \"kubernetes.io/projected/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-kube-api-access-xk9t4\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.328718 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328480 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-kubelet\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-systemd\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328483 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mrq52\"" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6840304-d946-4507-bcb1-55547ba40093-host-slash\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " pod="openshift-network-operator/iptables-alerter-4xm4l" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328558 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-os-release\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-node-log\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328595 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328637 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-cni-netd\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328673 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovnkube-config\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.328718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-cnibin\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328758 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328771 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-socket-dir-parent\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328763 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328826 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-hostroot\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-var-lib-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328925 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-env-overrides\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.328975 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d6840304-d946-4507-bcb1-55547ba40093-iptables-alerter-script\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " pod="openshift-network-operator/iptables-alerter-4xm4l" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-system-cni-dir\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-cni-bin\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-kubelet\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.329405 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:31:07.329109 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-systemd-units\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329132 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-slash\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovn-node-metrics-cert\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.329405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovnkube-script-lib\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329221 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqqw2\" (UniqueName: \"kubernetes.io/projected/6866a2aa-1943-4e03-a99a-8b054a2434c8-kube-api-access-dqqw2\") pod 
\"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3276fec-f4f2-47b0-bc26-38fc5eea9ab7-agent-certs\") pod \"konnectivity-agent-5kfq8\" (UID: \"c3276fec-f4f2-47b0-bc26-38fc5eea9ab7\") " pod="kube-system/konnectivity-agent-5kfq8" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329307 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4vcp\" (UniqueName: \"kubernetes.io/projected/c1494fbd-44b0-417d-857b-089a1705bbe9-kube-api-access-z4vcp\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-netns\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329416 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-run-netns\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329444 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-etc-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329481 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpc2h\" (UniqueName: \"kubernetes.io/projected/a5619baf-099b-4d83-ad43-fd7d0083f57b-kube-api-access-lpc2h\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" (UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: 
\"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.330042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.329552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-system-cni-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.330533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.330080 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:07.330533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.330142 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:07.330533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.330372 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tx9b2\"" Apr 23 13:31:07.330991 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.330978 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2rlxj"
Apr 23 13:31:07.331434 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.331415 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:31:07.333134 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.333120 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 13:31:07.333189 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.333123 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 13:31:07.333189 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.333165 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 13:31:07.333258 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.333187 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6vjsf\""
Apr 23 13:31:07.352380 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.352362 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h25lz"
Apr 23 13:31:07.357400 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.357382 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h25lz"
Apr 23 13:31:07.395440 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.395406 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a1bfc99dc8a1b47af59ec4d15c476b0.slice/crio-e1b87330edccc71eba340c8e97959a39c216502ad87e0875ee610de963f63c5c WatchSource:0}: Error finding container e1b87330edccc71eba340c8e97959a39c216502ad87e0875ee610de963f63c5c: Status 404 returned error can't find the container with id e1b87330edccc71eba340c8e97959a39c216502ad87e0875ee610de963f63c5c
Apr 23 13:31:07.395711 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.395690 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95cd19913bda1121c44084ba2ccca700.slice/crio-05e5dba5c618e73e80bd7a667207859496380287aed393b574214fcd6fc99ed6 WatchSource:0}: Error finding container 05e5dba5c618e73e80bd7a667207859496380287aed393b574214fcd6fc99ed6: Status 404 returned error can't find the container with id 05e5dba5c618e73e80bd7a667207859496380287aed393b574214fcd6fc99ed6
Apr 23 13:31:07.401816 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.401803 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:31:07.423072 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.423055 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 13:31:07.429932 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.429912 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc2h\" (UniqueName: \"kubernetes.io/projected/a5619baf-099b-4d83-ad43-fd7d0083f57b-kube-api-access-lpc2h\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430032 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.429940 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-system-cni-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.430032 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.429965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-ovn\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430032 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.429987 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-log-socket\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430032 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430010 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-system-cni-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.430032 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b9f46e6-e759-4c41-be74-de439958c36d-tmp\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-log-socket\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3276fec-f4f2-47b0-bc26-38fc5eea9ab7-konnectivity-ca\") pod \"konnectivity-agent-5kfq8\" (UID: \"c3276fec-f4f2-47b0-bc26-38fc5eea9ab7\") " pod="kube-system/konnectivity-agent-5kfq8"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430110 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-ovn\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430117 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-cnibin\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430154 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-daemon-config\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430169 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430190 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysctl-conf\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.430291 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430254 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430324 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c524008d-56ba-4f94-99b4-bd3ef55ba66f-host\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430365 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-cnibin\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xk9t4\" (UniqueName: \"kubernetes.io/projected/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-kube-api-access-xk9t4\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430406 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysctl-d\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430430 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-os-release\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430477 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-env-overrides\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-systemd\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430550 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-os-release\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-run-ovn-kubernetes\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-hostroot\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430549 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-hostroot\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-registration-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430639 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c3276fec-f4f2-47b0-bc26-38fc5eea9ab7-konnectivity-ca\") pod \"konnectivity-agent-5kfq8\" (UID: \"c3276fec-f4f2-47b0-bc26-38fc5eea9ab7\") " pod="kube-system/konnectivity-agent-5kfq8"
Apr 23 13:31:07.430736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-lib-modules\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430681 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-systemd-units\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430720 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-daemon-config\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430719 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-sys-fs\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-systemd-units\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430768 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqqw2\" (UniqueName: \"kubernetes.io/projected/6866a2aa-1943-4e03-a99a-8b054a2434c8-kube-api-access-dqqw2\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3276fec-f4f2-47b0-bc26-38fc5eea9ab7-agent-certs\") pod \"konnectivity-agent-5kfq8\" (UID: \"c3276fec-f4f2-47b0-bc26-38fc5eea9ab7\") " pod="kube-system/konnectivity-agent-5kfq8"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-run\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430948 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" (UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.430990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-k8s-cni-cncf-io\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431008 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-cni-bin\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovnkube-config\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mkdt\" (UniqueName: \"kubernetes.io/projected/d6840304-d946-4507-bcb1-55547ba40093-kube-api-access-2mkdt\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " pod="openshift-network-operator/iptables-alerter-4xm4l"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431062 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-k8s-cni-cncf-io\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-cni-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431117 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-cni-bin\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.431533 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431127 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-cni-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-cni-binary-copy\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-env-overrides\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-cni-multus\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431290 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-cni-binary-copy\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-conf-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-conf-dir\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431540 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-cni-multus\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-kubelet\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431605 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-socket-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431635 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c524008d-56ba-4f94-99b4-bd3ef55ba66f-serviceca\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-os-release\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-kubelet\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-multus-certs\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovnkube-config\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431784 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-multus-certs\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.432385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-etc-kubernetes\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-systemd\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-cni-binary-copy\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431838 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-os-release\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431872 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-run-systemd\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-node-log\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431880 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-etc-kubernetes\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-cni-netd\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431922 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-node-log\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431926 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-etc-selinux\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432182 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zrj4\" (UniqueName: \"kubernetes.io/projected/c524008d-56ba-4f94-99b4-bd3ef55ba66f-kube-api-access-4zrj4\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.431920 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" event={"ID":"5a1bfc99dc8a1b47af59ec4d15c476b0","Type":"ContainerStarted","Data":"e1b87330edccc71eba340c8e97959a39c216502ad87e0875ee610de963f63c5c"}
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432406 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432409 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-cni-netd\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432442 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6840304-d946-4507-bcb1-55547ba40093-host-slash\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " pod="openshift-network-operator/iptables-alerter-4xm4l"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-host\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432505 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86vhr\" (UniqueName: \"kubernetes.io/projected/8b9f46e6-e759-4c41-be74-de439958c36d-kube-api-access-86vhr\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d6840304-d946-4507-bcb1-55547ba40093-host-slash\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " pod="openshift-network-operator/iptables-alerter-4xm4l"
Apr 23 13:31:07.433179 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.432575 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432605 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-cnibin\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.432652 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:07.932624444 +0000 UTC m=+2.070903132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-cnibin\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-socket-dir-parent\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432820 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-var-lib-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432856 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-multus-socket-dir-parent\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432918 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovn-node-metrics-cert\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432923 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432958 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-modprobe-d\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll"
Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432973 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-var-lib-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.432991 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-sys\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.433191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d6840304-d946-4507-bcb1-55547ba40093-iptables-alerter-script\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " pod="openshift-network-operator/iptables-alerter-4xm4l" Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.433260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-system-cni-dir\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.434094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.433316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-cni-bin\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.434897 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-kubelet\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.434938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-slash\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.434976 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovnkube-script-lib\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-device-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-kubernetes\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.435360 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:31:07.435113 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4vcp\" (UniqueName: \"kubernetes.io/projected/c1494fbd-44b0-417d-857b-089a1705bbe9-kube-api-access-z4vcp\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-netns\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435184 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-run-netns\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2q5\" (UniqueName: \"kubernetes.io/projected/76d484ba-9464-4ff8-b453-2a202a4e649c-kube-api-access-qg2q5\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: 
\"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysconfig\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435322 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-var-lib-kubelet\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.435360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.433558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-cni-bin\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.433513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1494fbd-44b0-417d-857b-089a1705bbe9-system-cni-dir\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435422 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-slash\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.433876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d6840304-d946-4507-bcb1-55547ba40093-iptables-alerter-script\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " pod="openshift-network-operator/iptables-alerter-4xm4l" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435474 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-var-lib-kubelet\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435488 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-host-run-netns\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435479 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-host-run-netns\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/8b9f46e6-e759-4c41-be74-de439958c36d-etc-tuned\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-etc-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5619baf-099b-4d83-ad43-fd7d0083f57b-etc-openvswitch\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c3276fec-f4f2-47b0-bc26-38fc5eea9ab7-agent-certs\") pod \"konnectivity-agent-5kfq8\" (UID: \"c3276fec-f4f2-47b0-bc26-38fc5eea9ab7\") " pod="kube-system/konnectivity-agent-5kfq8" Apr 23 13:31:07.436033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.435945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovn-node-metrics-cert\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.436644 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.436108 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:07.436644 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.436124 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:07.436644 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.436133 2571 projected.go:194] Error preparing data for projected volume kube-api-access-xkgtj for pod openshift-network-diagnostics/network-check-target-qqr6d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:07.436644 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.436161 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5619baf-099b-4d83-ad43-fd7d0083f57b-ovnkube-script-lib\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.436644 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.436199 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj podName:958f9f23-bac7-4183-b8b4-7d4d89901105 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:07.936182815 +0000 UTC m=+2.074461477 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xkgtj" (UniqueName: "kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj") pod "network-check-target-qqr6d" (UID: "958f9f23-bac7-4183-b8b4-7d4d89901105") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:07.436908 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.436753 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" event={"ID":"95cd19913bda1121c44084ba2ccca700","Type":"ContainerStarted","Data":"05e5dba5c618e73e80bd7a667207859496380287aed393b574214fcd6fc99ed6"} Apr 23 13:31:07.436978 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.436956 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1494fbd-44b0-417d-857b-089a1705bbe9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.438466 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.438445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpc2h\" (UniqueName: \"kubernetes.io/projected/a5619baf-099b-4d83-ad43-fd7d0083f57b-kube-api-access-lpc2h\") pod \"ovnkube-node-zk8wt\" (UID: \"a5619baf-099b-4d83-ad43-fd7d0083f57b\") " pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.438537 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.438515 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mkdt\" (UniqueName: \"kubernetes.io/projected/d6840304-d946-4507-bcb1-55547ba40093-kube-api-access-2mkdt\") pod \"iptables-alerter-4xm4l\" (UID: \"d6840304-d946-4507-bcb1-55547ba40093\") " 
pod="openshift-network-operator/iptables-alerter-4xm4l" Apr 23 13:31:07.439119 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.439092 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk9t4\" (UniqueName: \"kubernetes.io/projected/abbb4dd8-8bc4-4850-bdc6-2b1ed2494101-kube-api-access-xk9t4\") pod \"multus-2g7ct\" (UID: \"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101\") " pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.439184 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.439134 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqqw2\" (UniqueName: \"kubernetes.io/projected/6866a2aa-1943-4e03-a99a-8b054a2434c8-kube-api-access-dqqw2\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:07.442883 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.442866 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4vcp\" (UniqueName: \"kubernetes.io/projected/c1494fbd-44b0-417d-857b-089a1705bbe9-kube-api-access-z4vcp\") pod \"multus-additional-cni-plugins-rm6cm\" (UID: \"c1494fbd-44b0-417d-857b-089a1705bbe9\") " pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.536573 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zrj4\" (UniqueName: \"kubernetes.io/projected/c524008d-56ba-4f94-99b4-bd3ef55ba66f-kube-api-access-4zrj4\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj" Apr 23 13:31:07.536573 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-host\") pod \"tuned-wfrll\" (UID: 
\"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536758 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536602 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-host\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536758 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86vhr\" (UniqueName: \"kubernetes.io/projected/8b9f46e6-e759-4c41-be74-de439958c36d-kube-api-access-86vhr\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536758 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-modprobe-d\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536758 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-sys\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536758 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-device-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: 
\"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.536758 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-kubernetes\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536758 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-sys\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2q5\" (UniqueName: \"kubernetes.io/projected/76d484ba-9464-4ff8-b453-2a202a4e649c-kube-api-access-qg2q5\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536808 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-modprobe-d\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536792 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-device-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-kubernetes\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536842 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysconfig\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536874 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysconfig\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536892 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-var-lib-kubelet\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8b9f46e6-e759-4c41-be74-de439958c36d-etc-tuned\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-var-lib-kubelet\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b9f46e6-e759-4c41-be74-de439958c36d-tmp\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysctl-conf\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.536993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.536989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c524008d-56ba-4f94-99b4-bd3ef55ba66f-host\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysctl-d\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537039 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-systemd\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537063 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-registration-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/c524008d-56ba-4f94-99b4-bd3ef55ba66f-host\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-lib-modules\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysctl-conf\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-sysctl-d\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537132 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-etc-systemd\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-sys-fs\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537183 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-lib-modules\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-registration-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-run\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b9f46e6-e759-4c41-be74-de439958c36d-run\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537238 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-sys-fs\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537242 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-socket-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537304 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c524008d-56ba-4f94-99b4-bd3ef55ba66f-serviceca\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj" Apr 23 13:31:07.537566 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-socket-dir\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.538033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-etc-selinux\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.538033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537414 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/76d484ba-9464-4ff8-b453-2a202a4e649c-etc-selinux\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.538033 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.537706 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c524008d-56ba-4f94-99b4-bd3ef55ba66f-serviceca\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj" Apr 23 13:31:07.538948 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.538928 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8b9f46e6-e759-4c41-be74-de439958c36d-etc-tuned\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.538994 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.538962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b9f46e6-e759-4c41-be74-de439958c36d-tmp\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.544545 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.544526 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zrj4\" (UniqueName: \"kubernetes.io/projected/c524008d-56ba-4f94-99b4-bd3ef55ba66f-kube-api-access-4zrj4\") pod \"node-ca-2rlxj\" (UID: \"c524008d-56ba-4f94-99b4-bd3ef55ba66f\") " pod="openshift-image-registry/node-ca-2rlxj" Apr 23 13:31:07.544650 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.544631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-86vhr\" (UniqueName: \"kubernetes.io/projected/8b9f46e6-e759-4c41-be74-de439958c36d-kube-api-access-86vhr\") pod \"tuned-wfrll\" (UID: \"8b9f46e6-e759-4c41-be74-de439958c36d\") " pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.544825 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.544811 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2q5\" (UniqueName: \"kubernetes.io/projected/76d484ba-9464-4ff8-b453-2a202a4e649c-kube-api-access-qg2q5\") pod \"aws-ebs-csi-driver-node-zq8gn\" (UID: \"76d484ba-9464-4ff8-b453-2a202a4e649c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.642761 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.642738 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" Apr 23 13:31:07.650222 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.650199 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1494fbd_44b0_417d_857b_089a1705bbe9.slice/crio-241504324ae3ec97779b4e79ecab74ebbdf5b488933465b6ce2d1627fc942093 WatchSource:0}: Error finding container 241504324ae3ec97779b4e79ecab74ebbdf5b488933465b6ce2d1627fc942093: Status 404 returned error can't find the container with id 241504324ae3ec97779b4e79ecab74ebbdf5b488933465b6ce2d1627fc942093 Apr 23 13:31:07.661616 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.661596 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-5kfq8" Apr 23 13:31:07.667345 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.667311 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3276fec_f4f2_47b0_bc26_38fc5eea9ab7.slice/crio-dda9092b5bea2231b3f0570f0690c9497189f65b5b01541b3bab024e860112f6 WatchSource:0}: Error finding container dda9092b5bea2231b3f0570f0690c9497189f65b5b01541b3bab024e860112f6: Status 404 returned error can't find the container with id dda9092b5bea2231b3f0570f0690c9497189f65b5b01541b3bab024e860112f6 Apr 23 13:31:07.678483 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.678464 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2g7ct" Apr 23 13:31:07.683821 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.683801 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabbb4dd8_8bc4_4850_bdc6_2b1ed2494101.slice/crio-485a16897258261c7cffa059b2277961dd05905709238b1e6f589f163e17cc60 WatchSource:0}: Error finding container 485a16897258261c7cffa059b2277961dd05905709238b1e6f589f163e17cc60: Status 404 returned error can't find the container with id 485a16897258261c7cffa059b2277961dd05905709238b1e6f589f163e17cc60 Apr 23 13:31:07.703626 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.703608 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4xm4l" Apr 23 13:31:07.708569 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.708551 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6840304_d946_4507_bcb1_55547ba40093.slice/crio-b43bc8946040c9970bf4051ff04803c32931a7d66a82ee3ecaa74e16db37b012 WatchSource:0}: Error finding container b43bc8946040c9970bf4051ff04803c32931a7d66a82ee3ecaa74e16db37b012: Status 404 returned error can't find the container with id b43bc8946040c9970bf4051ff04803c32931a7d66a82ee3ecaa74e16db37b012 Apr 23 13:31:07.709221 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.709203 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:31:07.714864 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.714845 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" Apr 23 13:31:07.716001 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.715982 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5619baf_099b_4d83_ad43_fd7d0083f57b.slice/crio-f170188312e2266d7d5332d7f9c0ddbffac6b7237cec12432fb65e70c0198ebd WatchSource:0}: Error finding container f170188312e2266d7d5332d7f9c0ddbffac6b7237cec12432fb65e70c0198ebd: Status 404 returned error can't find the container with id f170188312e2266d7d5332d7f9c0ddbffac6b7237cec12432fb65e70c0198ebd Apr 23 13:31:07.720265 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.720246 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wfrll" Apr 23 13:31:07.720532 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.720511 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d484ba_9464_4ff8_b453_2a202a4e649c.slice/crio-2255dbde619390c2b7d30a76ebaa4066668c57a1ac39057d9042c30b7c167f2a WatchSource:0}: Error finding container 2255dbde619390c2b7d30a76ebaa4066668c57a1ac39057d9042c30b7c167f2a: Status 404 returned error can't find the container with id 2255dbde619390c2b7d30a76ebaa4066668c57a1ac39057d9042c30b7c167f2a Apr 23 13:31:07.725091 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.725070 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2rlxj" Apr 23 13:31:07.725271 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.725255 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b9f46e6_e759_4c41_be74_de439958c36d.slice/crio-7b2a7924638ab6326f13c36f55392f99f631ce28388be1d2538ece650738bf83 WatchSource:0}: Error finding container 7b2a7924638ab6326f13c36f55392f99f631ce28388be1d2538ece650738bf83: Status 404 returned error can't find the container with id 7b2a7924638ab6326f13c36f55392f99f631ce28388be1d2538ece650738bf83 Apr 23 13:31:07.730465 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:07.730446 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc524008d_56ba_4f94_99b4_bd3ef55ba66f.slice/crio-f963a72814f923bba8515a015d8b250450c451051c92dce7a047a31d4f7c408a WatchSource:0}: Error finding container f963a72814f923bba8515a015d8b250450c451051c92dce7a047a31d4f7c408a: Status 404 returned error can't find the container with id f963a72814f923bba8515a015d8b250450c451051c92dce7a047a31d4f7c408a Apr 23 13:31:07.940074 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:31:07.939987 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" (UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:07.940074 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:07.940049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:07.940279 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.940167 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:07.940279 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.940197 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:07.940279 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.940209 2571 projected.go:194] Error preparing data for projected volume kube-api-access-xkgtj for pod openshift-network-diagnostics/network-check-target-qqr6d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:07.940279 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.940171 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:07.940501 ip-10-0-128-108 
kubenswrapper[2571]: E0423 13:31:07.940301 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj podName:958f9f23-bac7-4183-b8b4-7d4d89901105 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:08.940247865 +0000 UTC m=+3.078526537 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xkgtj" (UniqueName: "kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj") pod "network-check-target-qqr6d" (UID: "958f9f23-bac7-4183-b8b4-7d4d89901105") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:07.940501 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:07.940359 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:08.940311434 +0000 UTC m=+3.078590099 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:08.080468 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.080436 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:08.358841 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.358703 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:07 +0000 UTC" deadline="2027-11-16 18:49:03.908676565 +0000 UTC" Apr 23 13:31:08.358841 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.358741 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13733h17m55.549941248s" Apr 23 13:31:08.431856 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.431241 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:08.431856 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:08.431401 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:08.450186 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.450122 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2rlxj" event={"ID":"c524008d-56ba-4f94-99b4-bd3ef55ba66f","Type":"ContainerStarted","Data":"f963a72814f923bba8515a015d8b250450c451051c92dce7a047a31d4f7c408a"} Apr 23 13:31:08.463421 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.463387 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wfrll" event={"ID":"8b9f46e6-e759-4c41-be74-de439958c36d","Type":"ContainerStarted","Data":"7b2a7924638ab6326f13c36f55392f99f631ce28388be1d2538ece650738bf83"} Apr 23 13:31:08.470864 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.470834 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" event={"ID":"76d484ba-9464-4ff8-b453-2a202a4e649c","Type":"ContainerStarted","Data":"2255dbde619390c2b7d30a76ebaa4066668c57a1ac39057d9042c30b7c167f2a"} Apr 23 13:31:08.478691 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.478668 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" event={"ID":"c1494fbd-44b0-417d-857b-089a1705bbe9","Type":"ContainerStarted","Data":"241504324ae3ec97779b4e79ecab74ebbdf5b488933465b6ce2d1627fc942093"} Apr 23 13:31:08.482250 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.482224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"f170188312e2266d7d5332d7f9c0ddbffac6b7237cec12432fb65e70c0198ebd"} Apr 23 13:31:08.491471 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.491444 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4xm4l" event={"ID":"d6840304-d946-4507-bcb1-55547ba40093","Type":"ContainerStarted","Data":"b43bc8946040c9970bf4051ff04803c32931a7d66a82ee3ecaa74e16db37b012"} Apr 23 13:31:08.498981 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.498957 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:08.503176 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.503152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2g7ct" event={"ID":"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101","Type":"ContainerStarted","Data":"485a16897258261c7cffa059b2277961dd05905709238b1e6f589f163e17cc60"} Apr 23 13:31:08.505778 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.505754 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5kfq8" event={"ID":"c3276fec-f4f2-47b0-bc26-38fc5eea9ab7","Type":"ContainerStarted","Data":"dda9092b5bea2231b3f0570f0690c9497189f65b5b01541b3bab024e860112f6"} Apr 23 13:31:08.747803 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.747736 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:31:08.949453 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.948613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:08.949453 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:08.948694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" 
(UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:08.949453 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:08.948833 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:08.949453 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:08.948851 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:08.949453 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:08.948864 2571 projected.go:194] Error preparing data for projected volume kube-api-access-xkgtj for pod openshift-network-diagnostics/network-check-target-qqr6d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:08.949453 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:08.948921 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj podName:958f9f23-bac7-4183-b8b4-7d4d89901105 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:10.948902808 +0000 UTC m=+5.087181473 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xkgtj" (UniqueName: "kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj") pod "network-check-target-qqr6d" (UID: "958f9f23-bac7-4183-b8b4-7d4d89901105") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:08.949453 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:08.949367 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:08.949453 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:08.949418 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:10.949402814 +0000 UTC m=+5.087681494 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:09.360115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:09.360052 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:07 +0000 UTC" deadline="2028-01-28 08:01:07.241977764 +0000 UTC" Apr 23 13:31:09.360115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:09.360091 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15474h29m57.881890511s" Apr 23 13:31:09.429410 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:09.429378 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:09.429579 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:09.429509 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:10.429485 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:10.429446 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:10.429922 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:10.429594 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:10.964042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:10.963826 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" (UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:10.964042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:10.963892 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:10.964042 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:10.964021 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:10.964301 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:10.964065 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:10.964301 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:10.964082 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:14.964063862 +0000 UTC m=+9.102342529 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:10.964301 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:10.964090 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:10.964301 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:10.964104 2571 projected.go:194] Error preparing data for projected volume kube-api-access-xkgtj for pod openshift-network-diagnostics/network-check-target-qqr6d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:10.964301 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:10.964157 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj podName:958f9f23-bac7-4183-b8b4-7d4d89901105 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:14.964141344 +0000 UTC m=+9.102420011 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xkgtj" (UniqueName: "kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj") pod "network-check-target-qqr6d" (UID: "958f9f23-bac7-4183-b8b4-7d4d89901105") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:11.429212 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:11.429135 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:11.429394 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:11.429261 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:12.429034 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:12.429000 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:12.429475 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:12.429145 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:13.429549 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:13.429519 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:13.429964 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:13.429630 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:14.429191 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:14.429160 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:14.429378 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:14.429304 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:14.994162 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:14.994122 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" (UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:14.994639 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:14.994187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:14.994639 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:14.994303 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:14.994639 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:14.994305 2571 projected.go:289] Couldn't get 
configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:14.994639 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:14.994346 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:14.994639 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:14.994360 2571 projected.go:194] Error preparing data for projected volume kube-api-access-xkgtj for pod openshift-network-diagnostics/network-check-target-qqr6d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:14.994639 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:14.994371 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:22.99435648 +0000 UTC m=+17.132635142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:14.994639 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:14.994412 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj podName:958f9f23-bac7-4183-b8b4-7d4d89901105 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:22.994394528 +0000 UTC m=+17.132673196 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xkgtj" (UniqueName: "kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj") pod "network-check-target-qqr6d" (UID: "958f9f23-bac7-4183-b8b4-7d4d89901105") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:15.429219 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.429130 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:15.429417 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:15.429273 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:15.503722 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.502961 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-nbf64"] Apr 23 13:31:15.506214 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.506189 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.509661 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.509295 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:31:15.509661 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.509417 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:31:15.509661 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.509527 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-btbkw\"" Apr 23 13:31:15.597952 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.597913 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cecba8e3-e60c-4053-96af-5ab1c4960855-tmp-dir\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.598112 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.597976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cecba8e3-e60c-4053-96af-5ab1c4960855-hosts-file\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.598112 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.598065 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqdrj\" (UniqueName: \"kubernetes.io/projected/cecba8e3-e60c-4053-96af-5ab1c4960855-kube-api-access-fqdrj\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.699471 ip-10-0-128-108 kubenswrapper[2571]: I0423 
13:31:15.699381 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqdrj\" (UniqueName: \"kubernetes.io/projected/cecba8e3-e60c-4053-96af-5ab1c4960855-kube-api-access-fqdrj\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.699645 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.699476 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cecba8e3-e60c-4053-96af-5ab1c4960855-tmp-dir\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.699645 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.699522 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cecba8e3-e60c-4053-96af-5ab1c4960855-hosts-file\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.699645 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.699616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cecba8e3-e60c-4053-96af-5ab1c4960855-hosts-file\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.699914 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.699867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cecba8e3-e60c-4053-96af-5ab1c4960855-tmp-dir\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.709711 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.709115 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fqdrj\" (UniqueName: \"kubernetes.io/projected/cecba8e3-e60c-4053-96af-5ab1c4960855-kube-api-access-fqdrj\") pod \"node-resolver-nbf64\" (UID: \"cecba8e3-e60c-4053-96af-5ab1c4960855\") " pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:15.820445 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:15.820409 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nbf64" Apr 23 13:31:16.430004 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:16.429974 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:16.430468 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:16.430099 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:17.429279 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:17.429205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:17.429438 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:17.429324 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:18.429138 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:18.429108 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:18.429595 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:18.429267 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:19.429699 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:19.429662 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:19.430124 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:19.429782 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:19.992502 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:19.992469 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lv2lp"] Apr 23 13:31:19.996473 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:19.996443 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:19.996598 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:19.996516 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e" Apr 23 13:31:20.028682 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.028654 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e27565d-df42-41a7-9d41-eb8595cf751e-kubelet-config\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.028825 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.028805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e27565d-df42-41a7-9d41-eb8595cf751e-dbus\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.028895 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.028840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.129911 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.129874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e27565d-df42-41a7-9d41-eb8595cf751e-kubelet-config\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.130087 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.129947 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e27565d-df42-41a7-9d41-eb8595cf751e-dbus\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.130087 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.129964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.130087 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.130030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e27565d-df42-41a7-9d41-eb8595cf751e-kubelet-config\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.130197 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.130142 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e27565d-df42-41a7-9d41-eb8595cf751e-dbus\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.130197 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:20.130181 2571 secret.go:189] Couldn't get 
secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:20.130273 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:20.130247 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret podName:1e27565d-df42-41a7-9d41-eb8595cf751e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:20.63022913 +0000 UTC m=+14.768507811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret") pod "global-pull-secret-syncer-lv2lp" (UID: "1e27565d-df42-41a7-9d41-eb8595cf751e") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:20.429118 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.429032 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:20.429285 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:20.429172 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:20.633659 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:20.633622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:20.634074 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:20.633753 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:20.634074 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:20.633827 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret podName:1e27565d-df42-41a7-9d41-eb8595cf751e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:21.633808696 +0000 UTC m=+15.772087361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret") pod "global-pull-secret-syncer-lv2lp" (UID: "1e27565d-df42-41a7-9d41-eb8595cf751e") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:21.429504 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:21.429462 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:21.429765 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:21.429479 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:21.429765 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:21.429590 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e" Apr 23 13:31:21.429765 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:21.429653 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:21.640187 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:21.640145 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:21.640594 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:21.640289 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:21.640594 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:21.640383 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret podName:1e27565d-df42-41a7-9d41-eb8595cf751e nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:23.640362388 +0000 UTC m=+17.778641063 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret") pod "global-pull-secret-syncer-lv2lp" (UID: "1e27565d-df42-41a7-9d41-eb8595cf751e") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:22.429059 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:22.429022 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:22.429224 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:22.429145 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:23.051197 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:23.051165 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" (UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:23.051683 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:23.051228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:23.051683 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.051348 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:23.051683 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.051363 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:23.051683 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.051389 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:23.051683 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.051403 2571 projected.go:194] Error preparing data for projected volume kube-api-access-xkgtj for pod openshift-network-diagnostics/network-check-target-qqr6d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:23.051683 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.051424 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.051401696 +0000 UTC m=+33.189680379 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:23.051683 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.051451 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj podName:958f9f23-bac7-4183-b8b4-7d4d89901105 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.051437971 +0000 UTC m=+33.189716634 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xkgtj" (UniqueName: "kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj") pod "network-check-target-qqr6d" (UID: "958f9f23-bac7-4183-b8b4-7d4d89901105") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:23.428847 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:23.428726 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:23.428847 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:23.428726 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:23.429082 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.428909 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105" Apr 23 13:31:23.429082 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.429051 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e" Apr 23 13:31:23.656747 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:23.656703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:23.657022 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.656846 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:23.657022 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:23.656923 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret podName:1e27565d-df42-41a7-9d41-eb8595cf751e nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:27.656903925 +0000 UTC m=+21.795182591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret") pod "global-pull-secret-syncer-lv2lp" (UID: "1e27565d-df42-41a7-9d41-eb8595cf751e") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:24.429242 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:24.429207 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:31:24.429721 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:24.429354 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:31:25.429775 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.429606 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:25.430118 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.429604 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:31:25.430118 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:25.429887 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e"
Apr 23 13:31:25.430118 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:25.429926 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105"
Apr 23 13:31:25.554692 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.554662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wfrll" event={"ID":"8b9f46e6-e759-4c41-be74-de439958c36d","Type":"ContainerStarted","Data":"84296ad5139d0c74c9a3450e9d600bae32afa4da489bc26677676f4aba3f132e"}
Apr 23 13:31:25.557061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.557001 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"641a152c4cb919e36a751959354b6088a88f43c876d7a9581b93c0ca25e08d09"}
Apr 23 13:31:25.557061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.557036 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"7831b5a9a68858809e076e00a25d5bfae03cfbf812d2234795f4754dda75c4db"}
Apr 23 13:31:25.557061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.557050 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"72ceccbe0bed3fdd4076a3c85dff1ec7cc73e4f119dad5d99ca8e8a798f6e74c"}
Apr 23 13:31:25.559101 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.559068 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2g7ct" event={"ID":"abbb4dd8-8bc4-4850-bdc6-2b1ed2494101","Type":"ContainerStarted","Data":"90fd247bbfd542284c0575db8cb901fa4e96d768ec755b1d426176f38b2741db"}
Apr 23 13:31:25.560703 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.560670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nbf64" event={"ID":"cecba8e3-e60c-4053-96af-5ab1c4960855","Type":"ContainerStarted","Data":"6551a6563b2e9165edcc727e258218f5089452f9cc80ee0ddf1b2601b5dd74db"}
Apr 23 13:31:25.562424 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.562404 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" event={"ID":"95cd19913bda1121c44084ba2ccca700","Type":"ContainerStarted","Data":"a30c3d5a57fa217cd1e39d536b659fec1d5667de16464fba1fcd879653a59cca"}
Apr 23 13:31:25.570174 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.570136 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wfrll" podStartSLOduration=1.90246677 podStartE2EDuration="19.570126648s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:07.727796061 +0000 UTC m=+1.866074724" lastFinishedPulling="2026-04-23 13:31:25.395455923 +0000 UTC m=+19.533734602" observedRunningTime="2026-04-23 13:31:25.569936928 +0000 UTC m=+19.708215614" watchObservedRunningTime="2026-04-23 13:31:25.570126648 +0000 UTC m=+19.708405333"
Apr 23 13:31:25.585519 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.585462 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2g7ct" podStartSLOduration=2.085884899 podStartE2EDuration="19.585448279s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:07.685197514 +0000 UTC m=+1.823476178" lastFinishedPulling="2026-04-23 13:31:25.184760892 +0000 UTC m=+19.323039558" observedRunningTime="2026-04-23 13:31:25.584855292 +0000 UTC m=+19.723133976" watchObservedRunningTime="2026-04-23 13:31:25.585448279 +0000 UTC m=+19.723726965"
Apr 23 13:31:25.597681 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:25.597643 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-108.ec2.internal" podStartSLOduration=19.597632769 podStartE2EDuration="19.597632769s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:25.597238857 +0000 UTC m=+19.735517545" watchObservedRunningTime="2026-04-23 13:31:25.597632769 +0000 UTC m=+19.735911454"
Apr 23 13:31:26.429627 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.429596 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:26.429799 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:26.429687 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8"
Apr 23 13:31:26.564996 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.564969 2571 generic.go:358] "Generic (PLEG): container finished" podID="5a1bfc99dc8a1b47af59ec4d15c476b0" containerID="68c93a23666e4ee5b715ded548c4df0616ad1ad1ae06b49ec646ce4a498285f1" exitCode=0
Apr 23 13:31:26.565117 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.565037 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" event={"ID":"5a1bfc99dc8a1b47af59ec4d15c476b0","Type":"ContainerDied","Data":"68c93a23666e4ee5b715ded548c4df0616ad1ad1ae06b49ec646ce4a498285f1"}
Apr 23 13:31:26.566221 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.566199 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2rlxj" event={"ID":"c524008d-56ba-4f94-99b4-bd3ef55ba66f","Type":"ContainerStarted","Data":"4fcd694b4e0acbe872d42be7c93b10540d4be9a10f4d61c0f40f622deb3cbcf5"}
Apr 23 13:31:26.567373 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.567352 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" event={"ID":"76d484ba-9464-4ff8-b453-2a202a4e649c","Type":"ContainerStarted","Data":"092a272c296c29ce8f779c3dc3e85e5047209efead78a4b1a64d0a46aed50734"}
Apr 23 13:31:26.570728 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.570706 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1494fbd-44b0-417d-857b-089a1705bbe9" containerID="5af3ae367c9325f9c262768a4dc342d11dd7a0b31024b2a39a5559e51e401c84" exitCode=0
Apr 23 13:31:26.570815 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.570783 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" event={"ID":"c1494fbd-44b0-417d-857b-089a1705bbe9","Type":"ContainerDied","Data":"5af3ae367c9325f9c262768a4dc342d11dd7a0b31024b2a39a5559e51e401c84"}
Apr 23 13:31:26.573317 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.573299 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log"
Apr 23 13:31:26.573611 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.573589 2571 generic.go:358] "Generic (PLEG): container finished" podID="a5619baf-099b-4d83-ad43-fd7d0083f57b" containerID="7831b5a9a68858809e076e00a25d5bfae03cfbf812d2234795f4754dda75c4db" exitCode=1
Apr 23 13:31:26.573711 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.573609 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerDied","Data":"7831b5a9a68858809e076e00a25d5bfae03cfbf812d2234795f4754dda75c4db"}
Apr 23 13:31:26.573711 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.573635 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"40a73012a737bc48a82bb7e21bf6561dd82385d79c2102ef969b30899b484bf3"}
Apr 23 13:31:26.573711 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.573655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"1929f91e2e9628d8df8051d309fc8ffaf2f4aa909607a578c6cd98d04fe6da3d"}
Apr 23 13:31:26.573711 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.573664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"6e6b5bd40486938fa7dc23f9d520860190d1026466ad4893ce1a46d4bec02b87"}
Apr 23 13:31:26.574865 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.574834 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4xm4l" event={"ID":"d6840304-d946-4507-bcb1-55547ba40093","Type":"ContainerStarted","Data":"1a73516a94b8462613ce4991b0f5ee444aedeaf14d5d32318bd61285d3f2ccdb"}
Apr 23 13:31:26.576072 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.576053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5kfq8" event={"ID":"c3276fec-f4f2-47b0-bc26-38fc5eea9ab7","Type":"ContainerStarted","Data":"94edf2a2f9a0af2b392594f827c8e83320ceaef5f982b7f00d51c1bf205b52a5"}
Apr 23 13:31:26.577224 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.577205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nbf64" event={"ID":"cecba8e3-e60c-4053-96af-5ab1c4960855","Type":"ContainerStarted","Data":"8b419b1048535d03c1eb71053fb4bbd07835f6042591645b7e42361939ea2e83"}
Apr 23 13:31:26.594928 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.594887 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4xm4l" podStartSLOduration=3.157506101 podStartE2EDuration="20.59487565s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:07.70994274 +0000 UTC m=+1.848221403" lastFinishedPulling="2026-04-23 13:31:25.147312289 +0000 UTC m=+19.285590952" observedRunningTime="2026-04-23 13:31:26.594495839 +0000 UTC m=+20.732774528" watchObservedRunningTime="2026-04-23 13:31:26.59487565 +0000 UTC m=+20.733154345"
Apr 23 13:31:26.607320 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.607272 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2rlxj" podStartSLOduration=3.245346472 podStartE2EDuration="20.607257014s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:07.73191728 +0000 UTC m=+1.870195943" lastFinishedPulling="2026-04-23 13:31:25.093827807 +0000 UTC m=+19.232106485" observedRunningTime="2026-04-23 13:31:26.60711133 +0000 UTC m=+20.745390017" watchObservedRunningTime="2026-04-23 13:31:26.607257014 +0000 UTC m=+20.745535700"
Apr 23 13:31:26.621956 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.621910 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nbf64" podStartSLOduration=11.621894667 podStartE2EDuration="11.621894667s" podCreationTimestamp="2026-04-23 13:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:26.620970944 +0000 UTC m=+20.759249629" watchObservedRunningTime="2026-04-23 13:31:26.621894667 +0000 UTC m=+20.760173352"
Apr 23 13:31:26.656016 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:26.655980 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5kfq8" podStartSLOduration=3.1777855329999998 podStartE2EDuration="20.65596891s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:07.668767577 +0000 UTC m=+1.807046239" lastFinishedPulling="2026-04-23 13:31:25.146950946 +0000 UTC m=+19.285229616" observedRunningTime="2026-04-23 13:31:26.655780108 +0000 UTC m=+20.794058794" watchObservedRunningTime="2026-04-23 13:31:26.65596891 +0000 UTC m=+20.794247594"
Apr 23 13:31:27.277305 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.277282 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 13:31:27.397241 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.397069 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:31:27.277301724Z","UUID":"d3e8e773-d700-431e-9c52-165b1b60a892","Handler":null,"Name":"","Endpoint":""}
Apr 23 13:31:27.398854 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.398833 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 13:31:27.398953 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.398863 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 13:31:27.429312 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.429276 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:27.429494 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.429276 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:27.429494 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:27.429397 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e"
Apr 23 13:31:27.429494 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:27.429436 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105"
Apr 23 13:31:27.545765 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.545731 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5kfq8"
Apr 23 13:31:27.580969 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.580936 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" event={"ID":"5a1bfc99dc8a1b47af59ec4d15c476b0","Type":"ContainerStarted","Data":"f296e648f03c7d90250c2a4171f2123290bb590f22bafa4a8c96146d1de336a2"}
Apr 23 13:31:27.582685 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.582656 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" event={"ID":"76d484ba-9464-4ff8-b453-2a202a4e649c","Type":"ContainerStarted","Data":"e6efa66dbe4b8d522e19c827dd7ba99382118bc29d8fc852707649e910631ce1"}
Apr 23 13:31:27.690066 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:27.689960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:27.690233 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:27.690115 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:27.690233 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:27.690183 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret podName:1e27565d-df42-41a7-9d41-eb8595cf751e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:35.690163402 +0000 UTC m=+29.828442067 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret") pod "global-pull-secret-syncer-lv2lp" (UID: "1e27565d-df42-41a7-9d41-eb8595cf751e") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:28.429643 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:28.429569 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:28.429795 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:28.429704 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8"
Apr 23 13:31:28.586791 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:28.586753 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" event={"ID":"76d484ba-9464-4ff8-b453-2a202a4e649c","Type":"ContainerStarted","Data":"2569150584f5563c50a78d89415aa44aee7be6529b3ad89582b04bede6aef2f9"}
Apr 23 13:31:28.589711 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:28.589685 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log"
Apr 23 13:31:28.590024 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:28.589985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"24b7c43872dfbe837fdfb3d361deffa77c3533dcdf44079df909f8c32d9f387d"}
Apr 23 13:31:28.606800 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:28.605136 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-108.ec2.internal" podStartSLOduration=22.60511872 podStartE2EDuration="22.60511872s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:27.59648853 +0000 UTC m=+21.734767214" watchObservedRunningTime="2026-04-23 13:31:28.60511872 +0000 UTC m=+22.743397405"
Apr 23 13:31:29.428786 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:29.428754 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:29.428950 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:29.428757 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:29.428950 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:29.428851 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105"
Apr 23 13:31:29.429039 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:29.428964 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e"
Apr 23 13:31:30.312664 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.312626 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5kfq8"
Apr 23 13:31:30.313323 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.313261 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5kfq8"
Apr 23 13:31:30.331578 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.331528 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zq8gn" podStartSLOduration=3.9321753 podStartE2EDuration="24.331512845s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:07.722834601 +0000 UTC m=+1.861113268" lastFinishedPulling="2026-04-23 13:31:28.122172142 +0000 UTC m=+22.260450813" observedRunningTime="2026-04-23 13:31:28.607365849 +0000 UTC m=+22.745644536" watchObservedRunningTime="2026-04-23 13:31:30.331512845 +0000 UTC m=+24.469791531"
Apr 23 13:31:30.429292 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.429095 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:30.429436 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:30.429383 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8"
Apr 23 13:31:30.596612 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.596589 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log"
Apr 23 13:31:30.596939 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.596912 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"5e5afac7654d8c2cd47ad54d47a20a4adf08e8307963baa5500ea910d74f9819"}
Apr 23 13:31:30.597269 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.597248 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:30.597653 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.597465 2571 scope.go:117] "RemoveContainer" containerID="7831b5a9a68858809e076e00a25d5bfae03cfbf812d2234795f4754dda75c4db"
Apr 23 13:31:30.597727 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.597663 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5kfq8"
Apr 23 13:31:30.612511 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:30.612490 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:31.429004 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.428975 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:31.429568 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.428975 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:31.429568 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:31.429146 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e"
Apr 23 13:31:31.429568 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:31.429064 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105"
Apr 23 13:31:31.599958 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.599923 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1494fbd-44b0-417d-857b-089a1705bbe9" containerID="46fbcfd5d15aa249f6ab1db9d18ffb24a654a415162095596934be13ff391b35" exitCode=0
Apr 23 13:31:31.600139 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.599991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" event={"ID":"c1494fbd-44b0-417d-857b-089a1705bbe9","Type":"ContainerDied","Data":"46fbcfd5d15aa249f6ab1db9d18ffb24a654a415162095596934be13ff391b35"}
Apr 23 13:31:31.603360 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.603326 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log"
Apr 23 13:31:31.603769 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.603740 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" event={"ID":"a5619baf-099b-4d83-ad43-fd7d0083f57b","Type":"ContainerStarted","Data":"f82be9830f2c8e4a2d3ffe3eb3db4957b28ba765bfbbe46df058d22ebad5cc6d"}
Apr 23 13:31:31.603908 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.603888 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:31.603992 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.603917 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:31.618821 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.618802 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt"
Apr 23 13:31:31.675262 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:31.675209 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" podStartSLOduration=8.221012046 podStartE2EDuration="25.675191867s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:07.717877221 +0000 UTC m=+1.856155884" lastFinishedPulling="2026-04-23 13:31:25.17205704 +0000 UTC m=+19.310335705" observedRunningTime="2026-04-23 13:31:31.674644162 +0000 UTC m=+25.812922849" watchObservedRunningTime="2026-04-23 13:31:31.675191867 +0000 UTC m=+25.813470587"
Apr 23 13:31:32.433064 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:32.433040 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:32.433509 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:32.433186 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8"
Apr 23 13:31:32.660650 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:32.660621 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k958q"]
Apr 23 13:31:32.660772 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:32.660742 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:32.660917 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:32.660892 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8"
Apr 23 13:31:32.661764 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:32.661720 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qqr6d"]
Apr 23 13:31:32.661985 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:32.661970 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:32.662217 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:32.662197 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105"
Apr 23 13:31:32.662746 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:32.662702 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lv2lp"]
Apr 23 13:31:32.662832 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:32.662774 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:32.662888 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:32.662866 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e"
Apr 23 13:31:33.608950 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:33.608916 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1494fbd-44b0-417d-857b-089a1705bbe9" containerID="37287c9debfc1c2768b3acdde31b1f8773db9f12f1e46b358d2b1e9ffa4eaa16" exitCode=0
Apr 23 13:31:33.609534 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:33.608996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" event={"ID":"c1494fbd-44b0-417d-857b-089a1705bbe9","Type":"ContainerDied","Data":"37287c9debfc1c2768b3acdde31b1f8773db9f12f1e46b358d2b1e9ffa4eaa16"}
Apr 23 13:31:34.429301 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:34.429268 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:34.429440 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:34.429273 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:34.429440 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:34.429281 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:34.429440 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:34.429399 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e"
Apr 23 13:31:34.429557 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:34.429477 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105"
Apr 23 13:31:34.429557 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:34.429548 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8"
Apr 23 13:31:35.614688 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:35.614651 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1494fbd-44b0-417d-857b-089a1705bbe9" containerID="1fb92b4a81f1912e63b41f2adf5e4c9dff1df7367759a473e8aa2c399d4c24b5" exitCode=0
Apr 23 13:31:35.615219 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:35.614703 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" event={"ID":"c1494fbd-44b0-417d-857b-089a1705bbe9","Type":"ContainerDied","Data":"1fb92b4a81f1912e63b41f2adf5e4c9dff1df7367759a473e8aa2c399d4c24b5"}
Apr 23 13:31:35.757316 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:35.757281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:35.757483 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:35.757448 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:35.757541 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:35.757528 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret podName:1e27565d-df42-41a7-9d41-eb8595cf751e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:51.757509377 +0000 UTC m=+45.895788060 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret") pod "global-pull-secret-syncer-lv2lp" (UID: "1e27565d-df42-41a7-9d41-eb8595cf751e") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:36.429789 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:36.429755 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:36.430023 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:36.429801 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:36.430023 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:36.429855 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lv2lp" podUID="1e27565d-df42-41a7-9d41-eb8595cf751e"
Apr 23 13:31:36.430023 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:36.429911 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qqr6d" podUID="958f9f23-bac7-4183-b8b4-7d4d89901105"
Apr 23 13:31:36.430023 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:36.429924 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:36.430233 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:36.430042 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8"
Apr 23 13:31:38.204240 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.204059 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-108.ec2.internal" event="NodeReady"
Apr 23 13:31:38.204700 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.204395 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 13:31:38.246784 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.246754 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vf5fb"]
Apr 23 13:31:38.279168 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.279140 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wtf4m"]
Apr 23 13:31:38.279364 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.279324 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.282313 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.282293 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 13:31:38.282601 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.282584 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 13:31:38.283689 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.283670 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gtr2j\""
Apr 23 13:31:38.302709 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.302691 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vf5fb"]
Apr 23 13:31:38.302709 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.302713 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wtf4m"]
Apr 23 13:31:38.302869 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.302805 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:38.305395 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.305374 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 13:31:38.305531 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.305515 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 13:31:38.306015 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.305996 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pj4z2\""
Apr 23 13:31:38.306801 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.306718 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 13:31:38.377543 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.377431 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2wnn\" (UniqueName: \"kubernetes.io/projected/6566a883-ca9d-4251-b7f3-7e0e087e3020-kube-api-access-h2wnn\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:38.377684 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.377581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmz6h\" (UniqueName: \"kubernetes.io/projected/495e17e1-7de5-454d-a2b3-240f3cf879a4-kube-api-access-hmz6h\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.377684 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.377615 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/495e17e1-7de5-454d-a2b3-240f3cf879a4-config-volume\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.377684 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.377640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/495e17e1-7de5-454d-a2b3-240f3cf879a4-tmp-dir\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.377684 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.377682 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:38.377843 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.377699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.428805 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.428770 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp"
Apr 23 13:31:38.428805 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.428787 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:38.428805 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.428794 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:38.431787 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.431763 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:31:38.431932 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.431914 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:31:38.432075 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.432059 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lbjnn\""
Apr 23 13:31:38.432150 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.432063 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 13:31:38.432403 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.432383 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:31:38.432514 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.432386 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5zcnp\""
Apr 23 13:31:38.478924 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.478891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmz6h\" (UniqueName: \"kubernetes.io/projected/495e17e1-7de5-454d-a2b3-240f3cf879a4-kube-api-access-hmz6h\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.479061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.478933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/495e17e1-7de5-454d-a2b3-240f3cf879a4-config-volume\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.479061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.478953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/495e17e1-7de5-454d-a2b3-240f3cf879a4-tmp-dir\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.479061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.478980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:38.479061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.479007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.479061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.479036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2wnn\" (UniqueName: \"kubernetes.io/projected/6566a883-ca9d-4251-b7f3-7e0e087e3020-kube-api-access-h2wnn\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:38.479260 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:38.479144 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:38.479260 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:38.479219 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:38.979198685 +0000 UTC m=+33.117477353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:38.479260 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:38.479144 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:38.479378 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:38.479289 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:38.979276 +0000 UTC m=+33.117554665 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found
Apr 23 13:31:38.479378 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.479293 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/495e17e1-7de5-454d-a2b3-240f3cf879a4-tmp-dir\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.479509 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.479491 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/495e17e1-7de5-454d-a2b3-240f3cf879a4-config-volume\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.489212 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.489188 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmz6h\" (UniqueName: \"kubernetes.io/projected/495e17e1-7de5-454d-a2b3-240f3cf879a4-kube-api-access-hmz6h\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.489321 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.489244 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2wnn\" (UniqueName: \"kubernetes.io/projected/6566a883-ca9d-4251-b7f3-7e0e087e3020-kube-api-access-h2wnn\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:38.983146 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.983111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:38.983146 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:38.983152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:38.983404 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:38.983285 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:38.983404 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:38.983297 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:38.983404 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:38.983357 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.983322123 +0000 UTC m=+34.121600786 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:38.983404 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:38.983375 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.983369438 +0000 UTC m=+34.121648101 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found
Apr 23 13:31:39.083868 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:39.083835 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" (UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:39.084040 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:39.083882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q"
Apr 23 13:31:39.084040 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:39.084009 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:31:39.084151 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:39.084076 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:11.084055392 +0000 UTC m=+65.222334057 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : secret "metrics-daemon-secret" not found
Apr 23 13:31:39.086501 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:39.086475 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkgtj\" (UniqueName: \"kubernetes.io/projected/958f9f23-bac7-4183-b8b4-7d4d89901105-kube-api-access-xkgtj\") pod \"network-check-target-qqr6d\" (UID: \"958f9f23-bac7-4183-b8b4-7d4d89901105\") " pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:39.352423 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:39.352351 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:39.516211 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:39.516183 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qqr6d"]
Apr 23 13:31:39.519284 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:39.519259 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod958f9f23_bac7_4183_b8b4_7d4d89901105.slice/crio-07c94ca597b7976f67e0ec0c4619632e41de9bd6c55b2b0af571c215f2778564 WatchSource:0}: Error finding container 07c94ca597b7976f67e0ec0c4619632e41de9bd6c55b2b0af571c215f2778564: Status 404 returned error can't find the container with id 07c94ca597b7976f67e0ec0c4619632e41de9bd6c55b2b0af571c215f2778564
Apr 23 13:31:39.623258 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:39.623175 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qqr6d" event={"ID":"958f9f23-bac7-4183-b8b4-7d4d89901105","Type":"ContainerStarted","Data":"07c94ca597b7976f67e0ec0c4619632e41de9bd6c55b2b0af571c215f2778564"}
Apr 23 13:31:39.991382 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:39.991348 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:39.991567 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:39.991392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:39.991567 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:39.991529 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:39.991567 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:39.991537 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:39.991719 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:39.991589 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:41.991570471 +0000 UTC m=+36.129849137 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:39.991719 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:39.991611 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:41.99160142 +0000 UTC m=+36.129880084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found
Apr 23 13:31:42.005391 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:42.005357 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:42.005796 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:42.005407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:42.005796 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:42.005536 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:42.005796 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:42.005559 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:42.005796 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:42.005614 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:46.00559258 +0000 UTC m=+40.143871266 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found
Apr 23 13:31:42.005796 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:42.005636 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:46.005626601 +0000 UTC m=+40.143905272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:42.630975 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:42.630939 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1494fbd-44b0-417d-857b-089a1705bbe9" containerID="8b86dac75e44bb61fa61ca7211ccf9edcb817061de32be36c997bd95bd3c6aaa" exitCode=0
Apr 23 13:31:42.631134 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:42.631013 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" event={"ID":"c1494fbd-44b0-417d-857b-089a1705bbe9","Type":"ContainerDied","Data":"8b86dac75e44bb61fa61ca7211ccf9edcb817061de32be36c997bd95bd3c6aaa"}
Apr 23 13:31:43.636627 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:43.636516 2571 generic.go:358] "Generic (PLEG): container finished" podID="c1494fbd-44b0-417d-857b-089a1705bbe9" containerID="dc9656f9926e2ee3afd18ea7287ec1a32c14345655945fbb0dc88669a5661d29" exitCode=0
Apr 23 13:31:43.636627 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:43.636589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" event={"ID":"c1494fbd-44b0-417d-857b-089a1705bbe9","Type":"ContainerDied","Data":"dc9656f9926e2ee3afd18ea7287ec1a32c14345655945fbb0dc88669a5661d29"}
Apr 23 13:31:44.641307 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:44.641259 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" event={"ID":"c1494fbd-44b0-417d-857b-089a1705bbe9","Type":"ContainerStarted","Data":"43d277e6e123e5ca3b320c0c09e3c1ab5ab39c280c4e7aa1be5582db41b17add"}
Apr 23 13:31:44.642622 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:44.642597 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qqr6d" event={"ID":"958f9f23-bac7-4183-b8b4-7d4d89901105","Type":"ContainerStarted","Data":"b0071b1d0a45013e8ddb3dd4edc1e934fab041d7a29403b891fd74187b0b876e"}
Apr 23 13:31:44.642751 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:44.642734 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qqr6d"
Apr 23 13:31:44.662933 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:44.662898 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rm6cm" podStartSLOduration=4.51494228 podStartE2EDuration="38.662885437s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:07.651746862 +0000 UTC m=+1.790025526" lastFinishedPulling="2026-04-23 13:31:41.799690018 +0000 UTC m=+35.937968683" observedRunningTime="2026-04-23 13:31:44.661615422 +0000 UTC m=+38.799894098" watchObservedRunningTime="2026-04-23 13:31:44.662885437 +0000 UTC m=+38.801164119"
Apr 23 13:31:46.033689 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:46.033657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m"
Apr 23 13:31:46.033985 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:46.033697 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb"
Apr 23 13:31:46.033985 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:46.033792 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:46.033985 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:46.033826 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:46.033985 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:46.033849 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:54.033833902 +0000 UTC m=+48.172112567 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:46.033985 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:46.033892 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:54.033873169 +0000 UTC m=+48.172151855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found
Apr 23 13:31:49.444887 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.444834 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qqr6d" podStartSLOduration=38.492317932 podStartE2EDuration="43.444820321s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:39.521314867 +0000 UTC m=+33.659593538" lastFinishedPulling="2026-04-23 13:31:44.47381725 +0000 UTC m=+38.612095927" observedRunningTime="2026-04-23 13:31:44.675189927 +0000 UTC m=+38.813468613" watchObservedRunningTime="2026-04-23 13:31:49.444820321 +0000 UTC m=+43.583098998"
Apr 23 13:31:49.445398 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.445379 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4"]
Apr 23 13:31:49.468517 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.468493 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt"]
Apr 23 13:31:49.468649 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.468631 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4"
Apr 23 13:31:49.471498 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.471461 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-2hjrs\""
Apr 23 13:31:49.472191 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.472168 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 13:31:49.472301 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.472230 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 23 13:31:49.472388 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.472313 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 13:31:49.472693 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.472671 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 13:31:49.490188 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.490163 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9"]
Apr 23 13:31:49.490312 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.490297 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt"
Apr 23 13:31:49.494716 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.494699 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 23 13:31:49.518269 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.518247 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4"]
Apr 23 13:31:49.518269 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.518269 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9"]
Apr 23 13:31:49.518403 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.518277 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt"]
Apr 23 13:31:49.518403 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.518367 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9"
Apr 23 13:31:49.521621 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.521600 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 23 13:31:49.521714 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.521627 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 23 13:31:49.521834 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.521818 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 23 13:31:49.521875 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.521864 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 23 13:31:49.555852 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.555825 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txnv\" (UniqueName: \"kubernetes.io/projected/c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2-kube-api-access-2txnv\") pod \"managed-serviceaccount-addon-agent-f9c55fcd5-df8v4\" (UID: \"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4"
Apr 23 13:31:49.555964 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.555889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f9c55fcd5-df8v4\" (UID:
\"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" Apr 23 13:31:49.656488 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2txnv\" (UniqueName: \"kubernetes.io/projected/c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2-kube-api-access-2txnv\") pod \"managed-serviceaccount-addon-agent-f9c55fcd5-df8v4\" (UID: \"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" Apr 23 13:31:49.656650 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-klusterlet-config\") pod \"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.656650 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656518 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9f54f5af-8da9-41e6-bba0-09abda7835d7-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.656727 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656669 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-tmp\") pod \"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.656727 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656705 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-ca\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.656727 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.656854 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f9c55fcd5-df8v4\" (UID: \"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" Apr 23 13:31:49.656854 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656785 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgttq\" (UniqueName: \"kubernetes.io/projected/9f54f5af-8da9-41e6-bba0-09abda7835d7-kube-api-access-jgttq\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 
13:31:49.656854 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj4gv\" (UniqueName: \"kubernetes.io/projected/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-kube-api-access-qj4gv\") pod \"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.656854 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-hub\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.656991 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.656894 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.660192 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.660174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-f9c55fcd5-df8v4\" (UID: \"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" Apr 23 13:31:49.667290 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.667265 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txnv\" (UniqueName: \"kubernetes.io/projected/c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2-kube-api-access-2txnv\") pod \"managed-serviceaccount-addon-agent-f9c55fcd5-df8v4\" (UID: \"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" Apr 23 13:31:49.757558 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-ca\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.757709 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.757709 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757584 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgttq\" (UniqueName: \"kubernetes.io/projected/9f54f5af-8da9-41e6-bba0-09abda7835d7-kube-api-access-jgttq\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.757709 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj4gv\" (UniqueName: 
\"kubernetes.io/projected/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-kube-api-access-qj4gv\") pod \"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.757709 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757620 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-hub\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.757709 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757675 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.757936 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-klusterlet-config\") pod \"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.757936 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9f54f5af-8da9-41e6-bba0-09abda7835d7-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: 
\"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.757936 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.757861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-tmp\") pod \"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.758257 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.758233 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-tmp\") pod \"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.760257 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.760237 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-klusterlet-config\") pod \"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.768225 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.768202 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-ca\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.768426 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.768403 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9f54f5af-8da9-41e6-bba0-09abda7835d7-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.768426 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.768418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-hub\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.768516 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.768446 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.768516 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.768417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9f54f5af-8da9-41e6-bba0-09abda7835d7-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.772731 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.772706 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj4gv\" (UniqueName: \"kubernetes.io/projected/e3810e76-cf77-4c5b-8b1a-652d2b3cb359-kube-api-access-qj4gv\") pod 
\"klusterlet-addon-workmgr-58f594c6dd-vbgbt\" (UID: \"e3810e76-cf77-4c5b-8b1a-652d2b3cb359\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.774325 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.774306 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgttq\" (UniqueName: \"kubernetes.io/projected/9f54f5af-8da9-41e6-bba0-09abda7835d7-kube-api-access-jgttq\") pod \"cluster-proxy-proxy-agent-68b669577-z4bj9\" (UID: \"9f54f5af-8da9-41e6-bba0-09abda7835d7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.790239 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.790220 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" Apr 23 13:31:49.800923 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.800901 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:49.826049 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.826005 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:31:49.942620 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.942593 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4"] Apr 23 13:31:49.945755 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:49.945700 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71cc2a2_8c5e_442b_8c2f_d450cc0e0cb2.slice/crio-766785226a280c6292127352d65408f96c83ca6e0a71a9525f7187744f366c22 WatchSource:0}: Error finding container 766785226a280c6292127352d65408f96c83ca6e0a71a9525f7187744f366c22: Status 404 returned error can't find the container with id 766785226a280c6292127352d65408f96c83ca6e0a71a9525f7187744f366c22 Apr 23 13:31:49.957035 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.957010 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt"] Apr 23 13:31:49.959243 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:49.959221 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3810e76_cf77_4c5b_8b1a_652d2b3cb359.slice/crio-e152ebcadebd33f3d1b58d5edfeef5a37f7bdedfd0ffc615eb9fc40e41582536 WatchSource:0}: Error finding container e152ebcadebd33f3d1b58d5edfeef5a37f7bdedfd0ffc615eb9fc40e41582536: Status 404 returned error can't find the container with id e152ebcadebd33f3d1b58d5edfeef5a37f7bdedfd0ffc615eb9fc40e41582536 Apr 23 13:31:49.981062 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:49.981041 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9"] Apr 23 13:31:49.983421 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:49.983398 2571 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f54f5af_8da9_41e6_bba0_09abda7835d7.slice/crio-6c88f6fbd128612c0f46eb0246f47b62a88518caee02d3d70018ad0df0c74444 WatchSource:0}: Error finding container 6c88f6fbd128612c0f46eb0246f47b62a88518caee02d3d70018ad0df0c74444: Status 404 returned error can't find the container with id 6c88f6fbd128612c0f46eb0246f47b62a88518caee02d3d70018ad0df0c74444 Apr 23 13:31:50.655114 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:50.655063 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" event={"ID":"9f54f5af-8da9-41e6-bba0-09abda7835d7","Type":"ContainerStarted","Data":"6c88f6fbd128612c0f46eb0246f47b62a88518caee02d3d70018ad0df0c74444"} Apr 23 13:31:50.657194 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:50.656572 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" event={"ID":"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2","Type":"ContainerStarted","Data":"766785226a280c6292127352d65408f96c83ca6e0a71a9525f7187744f366c22"} Apr 23 13:31:50.658293 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:50.658266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" event={"ID":"e3810e76-cf77-4c5b-8b1a-652d2b3cb359","Type":"ContainerStarted","Data":"e152ebcadebd33f3d1b58d5edfeef5a37f7bdedfd0ffc615eb9fc40e41582536"} Apr 23 13:31:51.777141 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:51.776628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " 
pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:51.785847 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:51.785753 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e27565d-df42-41a7-9d41-eb8595cf751e-original-pull-secret\") pod \"global-pull-secret-syncer-lv2lp\" (UID: \"1e27565d-df42-41a7-9d41-eb8595cf751e\") " pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:51.940183 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:51.940147 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lv2lp" Apr 23 13:31:54.095783 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:54.095738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb" Apr 23 13:31:54.096248 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:54.095929 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:31:54.096248 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:54.095970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m" Apr 23 13:31:54.096248 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:54.096002 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:10.09598221 +0000 UTC m=+64.234260873 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found Apr 23 13:31:54.096248 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:54.096044 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:31:54.096248 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:31:54.096078 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:10.096066737 +0000 UTC m=+64.234345415 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found Apr 23 13:31:55.041052 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:55.040968 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lv2lp"] Apr 23 13:31:55.045079 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:31:55.045047 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e27565d_df42_41a7_9d41_eb8595cf751e.slice/crio-e00ca8a7fe2a3bdbef8c0057f0b5564e8115f61591275cc9b9084de381b9ea36 WatchSource:0}: Error finding container e00ca8a7fe2a3bdbef8c0057f0b5564e8115f61591275cc9b9084de381b9ea36: Status 404 returned error can't find the container with id e00ca8a7fe2a3bdbef8c0057f0b5564e8115f61591275cc9b9084de381b9ea36 Apr 23 13:31:55.672127 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:31:55.672086 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" event={"ID":"9f54f5af-8da9-41e6-bba0-09abda7835d7","Type":"ContainerStarted","Data":"d807e56074d7795f246c0dbc24756916a1e92b879113a79b7c601ee190b8d1af"} Apr 23 13:31:55.673478 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:55.673450 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" event={"ID":"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2","Type":"ContainerStarted","Data":"ced7c49b285280528d1632b0ae52d306363a3f554daecfd927e29af7b09ed902"} Apr 23 13:31:55.674801 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:55.674776 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" event={"ID":"e3810e76-cf77-4c5b-8b1a-652d2b3cb359","Type":"ContainerStarted","Data":"5b8cd4605ef9cc6cb33b37236635fc549c9035130da2a5bed373af4af88c7cf8"} Apr 23 13:31:55.675015 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:55.674984 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:55.676111 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:55.676088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lv2lp" event={"ID":"1e27565d-df42-41a7-9d41-eb8595cf751e","Type":"ContainerStarted","Data":"e00ca8a7fe2a3bdbef8c0057f0b5564e8115f61591275cc9b9084de381b9ea36"} Apr 23 13:31:55.676578 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:55.676560 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:31:55.689960 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:55.689658 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" podStartSLOduration=1.701598363 podStartE2EDuration="6.689646801s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:49.947783957 +0000 UTC m=+44.086062619" lastFinishedPulling="2026-04-23 13:31:54.93583239 +0000 UTC m=+49.074111057" observedRunningTime="2026-04-23 13:31:55.688996839 +0000 UTC m=+49.827275524" watchObservedRunningTime="2026-04-23 13:31:55.689646801 +0000 UTC m=+49.827925487" Apr 23 13:31:55.704751 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:55.704705 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" podStartSLOduration=1.705325711 podStartE2EDuration="6.704691932s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:49.961057428 +0000 UTC m=+44.099336095" lastFinishedPulling="2026-04-23 13:31:54.960423652 +0000 UTC m=+49.098702316" observedRunningTime="2026-04-23 13:31:55.704020852 +0000 UTC m=+49.842299539" watchObservedRunningTime="2026-04-23 13:31:55.704691932 +0000 UTC m=+49.842970613" Apr 23 13:31:59.687208 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:59.687166 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lv2lp" event={"ID":"1e27565d-df42-41a7-9d41-eb8595cf751e","Type":"ContainerStarted","Data":"517f8c33328f34b7e353a237a65f9b58c8d2869361d4ee1dfa96c420440f2c2d"} Apr 23 13:31:59.688873 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:59.688849 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" event={"ID":"9f54f5af-8da9-41e6-bba0-09abda7835d7","Type":"ContainerStarted","Data":"82bda2a26b1e0894f2e452d3eab41925ee1d383b14336846c2c7d0e005d65087"} Apr 23 
13:31:59.688989 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:59.688878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" event={"ID":"9f54f5af-8da9-41e6-bba0-09abda7835d7","Type":"ContainerStarted","Data":"097eded5826238eb9f98404ada5454263c41ee1882a23e0d69a10f733a83d83a"} Apr 23 13:31:59.702561 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:59.702519 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lv2lp" podStartSLOduration=36.239122678 podStartE2EDuration="40.702505698s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:55.047172848 +0000 UTC m=+49.185451526" lastFinishedPulling="2026-04-23 13:31:59.510555869 +0000 UTC m=+53.648834546" observedRunningTime="2026-04-23 13:31:59.701870231 +0000 UTC m=+53.840148915" watchObservedRunningTime="2026-04-23 13:31:59.702505698 +0000 UTC m=+53.840784383" Apr 23 13:31:59.718317 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:31:59.718252 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" podStartSLOduration=1.203287368 podStartE2EDuration="10.718237566s" podCreationTimestamp="2026-04-23 13:31:49 +0000 UTC" firstStartedPulling="2026-04-23 13:31:49.984988843 +0000 UTC m=+44.123267510" lastFinishedPulling="2026-04-23 13:31:59.499939035 +0000 UTC m=+53.638217708" observedRunningTime="2026-04-23 13:31:59.718117883 +0000 UTC m=+53.856396580" watchObservedRunningTime="2026-04-23 13:31:59.718237566 +0000 UTC m=+53.856516248" Apr 23 13:32:03.626848 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:32:03.626819 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zk8wt" Apr 23 13:32:10.105774 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:32:10.105737 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m" Apr 23 13:32:10.105774 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:32:10.105774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb" Apr 23 13:32:10.106245 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:10.105876 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:10.106245 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:10.105882 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:10.106245 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:10.105928 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.10591366 +0000 UTC m=+96.244192322 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found Apr 23 13:32:10.106245 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:10.105946 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:42.105933509 +0000 UTC m=+96.244212176 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found Apr 23 13:32:11.114594 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:32:11.114555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:32:11.114967 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:11.114662 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:32:11.114967 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:11.114716 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:15.114701822 +0000 UTC m=+129.252980485 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : secret "metrics-daemon-secret" not found Apr 23 13:32:15.646370 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:32:15.646317 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qqr6d" Apr 23 13:32:42.126953 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:32:42.126909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m" Apr 23 13:32:42.126953 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:32:42.126954 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb" Apr 23 13:32:42.127397 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:42.127051 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:42.127397 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:42.127108 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:46.127092191 +0000 UTC m=+160.265370854 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found Apr 23 13:32:42.127397 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:42.127051 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:42.127397 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:32:42.127197 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:46.127181888 +0000 UTC m=+160.265460571 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found Apr 23 13:33:15.153728 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:15.153684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:33:15.154180 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:15.153829 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:33:15.154180 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:15.153903 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs podName:6866a2aa-1943-4e03-a99a-8b054a2434c8 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:35:17.153886586 +0000 UTC m=+251.292165250 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs") pod "network-metrics-daemon-k958q" (UID: "6866a2aa-1943-4e03-a99a-8b054a2434c8") : secret "metrics-daemon-secret" not found Apr 23 13:33:34.181172 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:34.181144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nbf64_cecba8e3-e60c-4053-96af-5ab1c4960855/dns-node-resolver/0.log" Apr 23 13:33:34.781516 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:34.781487 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2rlxj_c524008d-56ba-4f94-99b4-bd3ef55ba66f/node-ca/0.log" Apr 23 13:33:41.291211 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:41.291151 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vf5fb" podUID="495e17e1-7de5-454d-a2b3-240f3cf879a4" Apr 23 13:33:41.314755 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:41.314723 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wtf4m" podUID="6566a883-ca9d-4251-b7f3-7e0e087e3020" Apr 23 13:33:41.446666 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:41.446634 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-k958q" podUID="6866a2aa-1943-4e03-a99a-8b054a2434c8" Apr 23 13:33:41.928755 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:41.928729 2571 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vf5fb" Apr 23 13:33:46.164985 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:46.164958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m" Apr 23 13:33:46.165387 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:46.164992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb" Apr 23 13:33:46.165387 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:46.165102 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:33:46.165387 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:46.165169 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls podName:495e17e1-7de5-454d-a2b3-240f3cf879a4 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:48.165154187 +0000 UTC m=+282.303432851 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls") pod "dns-default-vf5fb" (UID: "495e17e1-7de5-454d-a2b3-240f3cf879a4") : secret "dns-default-metrics-tls" not found Apr 23 13:33:46.165387 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:46.165102 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:33:46.165387 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:33:46.165236 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert podName:6566a883-ca9d-4251-b7f3-7e0e087e3020 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:48.165224798 +0000 UTC m=+282.303503465 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert") pod "ingress-canary-wtf4m" (UID: "6566a883-ca9d-4251-b7f3-7e0e087e3020") : secret "canary-serving-cert" not found Apr 23 13:33:53.428917 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:53.428830 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:33:55.429700 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:55.429608 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wtf4m" Apr 23 13:33:55.675399 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:55.675322 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" podUID="e3810e76-cf77-4c5b-8b1a-652d2b3cb359" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.9:8000/readyz\": dial tcp 10.133.0.9:8000: connect: connection refused" Apr 23 13:33:55.960390 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:55.960353 2571 generic.go:358] "Generic (PLEG): container finished" podID="c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2" containerID="ced7c49b285280528d1632b0ae52d306363a3f554daecfd927e29af7b09ed902" exitCode=255 Apr 23 13:33:55.960574 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:55.960424 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" event={"ID":"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2","Type":"ContainerDied","Data":"ced7c49b285280528d1632b0ae52d306363a3f554daecfd927e29af7b09ed902"} Apr 23 13:33:55.960786 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:55.960768 2571 scope.go:117] "RemoveContainer" containerID="ced7c49b285280528d1632b0ae52d306363a3f554daecfd927e29af7b09ed902" Apr 23 13:33:55.961670 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:55.961646 2571 generic.go:358] "Generic (PLEG): container finished" podID="e3810e76-cf77-4c5b-8b1a-652d2b3cb359" containerID="5b8cd4605ef9cc6cb33b37236635fc549c9035130da2a5bed373af4af88c7cf8" exitCode=1 Apr 23 13:33:55.961738 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:55.961687 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" event={"ID":"e3810e76-cf77-4c5b-8b1a-652d2b3cb359","Type":"ContainerDied","Data":"5b8cd4605ef9cc6cb33b37236635fc549c9035130da2a5bed373af4af88c7cf8"} Apr 
23 13:33:55.961941 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:55.961927 2571 scope.go:117] "RemoveContainer" containerID="5b8cd4605ef9cc6cb33b37236635fc549c9035130da2a5bed373af4af88c7cf8" Apr 23 13:33:56.965515 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:56.965476 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" event={"ID":"e3810e76-cf77-4c5b-8b1a-652d2b3cb359","Type":"ContainerStarted","Data":"60c2a85357856e8963ba2c0ef14c7a3b26294a5746eee2a17e2f2f4edc557076"} Apr 23 13:33:56.966000 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:56.965810 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:33:56.966530 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:56.966507 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-58f594c6dd-vbgbt" Apr 23 13:33:56.966917 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:33:56.966899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-f9c55fcd5-df8v4" event={"ID":"c71cc2a2-8c5e-442b-8c2f-d450cc0e0cb2","Type":"ContainerStarted","Data":"fe457da0d8e5a38888d0bce7c7f299f78126e9cd09b89869b6295568c401a68e"} Apr 23 13:34:02.459831 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.459798 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pn526"] Apr 23 13:34:02.462909 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.462889 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.465451 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.465431 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qjjwc\"" Apr 23 13:34:02.466584 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.466566 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:34:02.466671 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.466566 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:34:02.466671 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.466611 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:34:02.466671 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.466642 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:34:02.478715 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.478690 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/12e5c21a-627a-4ad7-b68e-6300b337cf27-crio-socket\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.478824 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.478730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/12e5c21a-627a-4ad7-b68e-6300b337cf27-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " 
pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.478824 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.478755 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/12e5c21a-627a-4ad7-b68e-6300b337cf27-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.478929 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.478840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjvg\" (UniqueName: \"kubernetes.io/projected/12e5c21a-627a-4ad7-b68e-6300b337cf27-kube-api-access-gvjvg\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.478929 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.478923 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/12e5c21a-627a-4ad7-b68e-6300b337cf27-data-volume\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.479613 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.479594 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pn526"] Apr 23 13:34:02.580223 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.580189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/12e5c21a-627a-4ad7-b68e-6300b337cf27-crio-socket\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " 
pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.580223 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.580222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/12e5c21a-627a-4ad7-b68e-6300b337cf27-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.580475 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.580240 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/12e5c21a-627a-4ad7-b68e-6300b337cf27-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.580475 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.580268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjvg\" (UniqueName: \"kubernetes.io/projected/12e5c21a-627a-4ad7-b68e-6300b337cf27-kube-api-access-gvjvg\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.580475 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.580315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/12e5c21a-627a-4ad7-b68e-6300b337cf27-crio-socket\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.580475 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.580326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/12e5c21a-627a-4ad7-b68e-6300b337cf27-data-volume\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.580759 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.580740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/12e5c21a-627a-4ad7-b68e-6300b337cf27-data-volume\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.580807 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.580785 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/12e5c21a-627a-4ad7-b68e-6300b337cf27-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.582554 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.582521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/12e5c21a-627a-4ad7-b68e-6300b337cf27-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.611595 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.611555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjvg\" (UniqueName: \"kubernetes.io/projected/12e5c21a-627a-4ad7-b68e-6300b337cf27-kube-api-access-gvjvg\") pod \"insights-runtime-extractor-pn526\" (UID: \"12e5c21a-627a-4ad7-b68e-6300b337cf27\") " pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.775170 ip-10-0-128-108 kubenswrapper[2571]: 
I0423 13:34:02.775080 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pn526" Apr 23 13:34:02.899942 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.899905 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pn526"] Apr 23 13:34:02.904970 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:34:02.904936 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e5c21a_627a_4ad7_b68e_6300b337cf27.slice/crio-792532919beab39b087f0019727b0538685fe6efa8a76d5543e5133f2475c9db WatchSource:0}: Error finding container 792532919beab39b087f0019727b0538685fe6efa8a76d5543e5133f2475c9db: Status 404 returned error can't find the container with id 792532919beab39b087f0019727b0538685fe6efa8a76d5543e5133f2475c9db Apr 23 13:34:02.981430 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.981399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pn526" event={"ID":"12e5c21a-627a-4ad7-b68e-6300b337cf27","Type":"ContainerStarted","Data":"5e7ef794b23616c8de5a1666eb6807b3fb57d94d4a649f361edab46084da9d9b"} Apr 23 13:34:02.981430 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:02.981434 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pn526" event={"ID":"12e5c21a-627a-4ad7-b68e-6300b337cf27","Type":"ContainerStarted","Data":"792532919beab39b087f0019727b0538685fe6efa8a76d5543e5133f2475c9db"} Apr 23 13:34:03.984945 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:03.984908 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pn526" event={"ID":"12e5c21a-627a-4ad7-b68e-6300b337cf27","Type":"ContainerStarted","Data":"d1b6fc3bcf34e1d2c21a5221e6c819c1b12b8d643d9ed29afecaa48870297779"} Apr 23 13:34:05.991254 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:34:05.991220 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pn526" event={"ID":"12e5c21a-627a-4ad7-b68e-6300b337cf27","Type":"ContainerStarted","Data":"fca55cb6790354912af43e0dcb2e6e5a4e459ce8a50de3508f4aa686c2bba71d"} Apr 23 13:34:06.010051 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:06.009929 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pn526" podStartSLOduration=2.003399808 podStartE2EDuration="4.00991387s" podCreationTimestamp="2026-04-23 13:34:02 +0000 UTC" firstStartedPulling="2026-04-23 13:34:02.958527666 +0000 UTC m=+177.096806335" lastFinishedPulling="2026-04-23 13:34:04.965041724 +0000 UTC m=+179.103320397" observedRunningTime="2026-04-23 13:34:06.009436666 +0000 UTC m=+180.147715345" watchObservedRunningTime="2026-04-23 13:34:06.00991387 +0000 UTC m=+180.148192555" Apr 23 13:34:11.922415 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.922378 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cxbrs"] Apr 23 13:34:11.925639 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.925619 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.928099 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.928077 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 13:34:11.928190 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.928127 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 13:34:11.928190 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.928149 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 13:34:11.928190 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.928160 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-d2mkv\"" Apr 23 13:34:11.928993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.928970 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 13:34:11.929099 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.929025 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 13:34:11.929099 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.929091 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 13:34:11.945307 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-sys\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " 
pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.945420 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945315 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-textfile\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.945420 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.945420 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945373 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-tls\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.945552 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945436 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-wtmp\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.945552 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945529 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.945624 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-root\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.945665 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945651 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/743cfb34-da8a-415a-8dbe-192227645691-metrics-client-ca\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:11.945700 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:11.945677 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltlpn\" (UniqueName: \"kubernetes.io/projected/743cfb34-da8a-415a-8dbe-192227645691-kube-api-access-ltlpn\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046121 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-textfile\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046121 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046125 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046370 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-tls\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046370 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046158 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-wtmp\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046370 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046243 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046370 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:34:12.046258 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 13:34:12.046370 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046303 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-root\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046370 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:34:12.046348 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-tls podName:743cfb34-da8a-415a-8dbe-192227645691 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:12.546309539 +0000 UTC m=+186.684588202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-tls") pod "node-exporter-cxbrs" (UID: "743cfb34-da8a-415a-8dbe-192227645691") : secret "node-exporter-tls" not found Apr 23 13:34:12.046370 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-root\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046682 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046407 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-wtmp\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046682 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/743cfb34-da8a-415a-8dbe-192227645691-metrics-client-ca\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.046921 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-textfile\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.047045 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046939 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltlpn\" (UniqueName: \"kubernetes.io/projected/743cfb34-da8a-415a-8dbe-192227645691-kube-api-access-ltlpn\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.047045 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.046992 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-sys\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.047157 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.047057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/743cfb34-da8a-415a-8dbe-192227645691-sys\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.047284 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.047262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.047433 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.047410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/743cfb34-da8a-415a-8dbe-192227645691-metrics-client-ca\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.051467 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.051441 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.057212 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.057193 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltlpn\" (UniqueName: \"kubernetes.io/projected/743cfb34-da8a-415a-8dbe-192227645691-kube-api-access-ltlpn\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.552853 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.552818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-tls\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.554991 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.554959 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/743cfb34-da8a-415a-8dbe-192227645691-node-exporter-tls\") pod \"node-exporter-cxbrs\" (UID: \"743cfb34-da8a-415a-8dbe-192227645691\") " pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.834891 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:12.834796 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cxbrs" Apr 23 13:34:12.842423 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:34:12.842396 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod743cfb34_da8a_415a_8dbe_192227645691.slice/crio-72060e7cc9c0673d6134185765023971fbc354fedf794e80d20e23f1be6c4745 WatchSource:0}: Error finding container 72060e7cc9c0673d6134185765023971fbc354fedf794e80d20e23f1be6c4745: Status 404 returned error can't find the container with id 72060e7cc9c0673d6134185765023971fbc354fedf794e80d20e23f1be6c4745 Apr 23 13:34:13.008654 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:13.008615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxbrs" event={"ID":"743cfb34-da8a-415a-8dbe-192227645691","Type":"ContainerStarted","Data":"72060e7cc9c0673d6134185765023971fbc354fedf794e80d20e23f1be6c4745"} Apr 23 13:34:14.011765 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:14.011735 2571 generic.go:358] "Generic (PLEG): container finished" podID="743cfb34-da8a-415a-8dbe-192227645691" containerID="49d50397d342341a12eae308e373232d7e0774324beb06daeb18fce9348985e9" exitCode=0 Apr 23 13:34:14.012132 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:14.011816 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxbrs" event={"ID":"743cfb34-da8a-415a-8dbe-192227645691","Type":"ContainerDied","Data":"49d50397d342341a12eae308e373232d7e0774324beb06daeb18fce9348985e9"} Apr 23 
13:34:15.016534 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:15.016498 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxbrs" event={"ID":"743cfb34-da8a-415a-8dbe-192227645691","Type":"ContainerStarted","Data":"fb7256b666883a5a7b4bdd0ab636f98240d8d390118bec0a64eec21d58d5396e"} Apr 23 13:34:15.016534 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:15.016533 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxbrs" event={"ID":"743cfb34-da8a-415a-8dbe-192227645691","Type":"ContainerStarted","Data":"26247e7b84b7a7ea73c26de12c4eb83831eb369635a5c86670c897b428aab669"} Apr 23 13:34:15.035527 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:15.035464 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cxbrs" podStartSLOduration=3.228041496 podStartE2EDuration="4.035451223s" podCreationTimestamp="2026-04-23 13:34:11 +0000 UTC" firstStartedPulling="2026-04-23 13:34:12.844070915 +0000 UTC m=+186.982349581" lastFinishedPulling="2026-04-23 13:34:13.651480635 +0000 UTC m=+187.789759308" observedRunningTime="2026-04-23 13:34:15.034349852 +0000 UTC m=+189.172628528" watchObservedRunningTime="2026-04-23 13:34:15.035451223 +0000 UTC m=+189.173729908" Apr 23 13:34:39.827296 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:39.827250 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" podUID="9f54f5af-8da9-41e6-bba0-09abda7835d7" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 13:34:49.827084 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:49.827040 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" podUID="9f54f5af-8da9-41e6-bba0-09abda7835d7" containerName="service-proxy" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Apr 23 13:34:59.827195 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:59.827151 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" podUID="9f54f5af-8da9-41e6-bba0-09abda7835d7" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 13:34:59.827571 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:59.827233 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" Apr 23 13:34:59.827736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:59.827705 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"82bda2a26b1e0894f2e452d3eab41925ee1d383b14336846c2c7d0e005d65087"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 13:34:59.827782 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:34:59.827768 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" podUID="9f54f5af-8da9-41e6-bba0-09abda7835d7" containerName="service-proxy" containerID="cri-o://82bda2a26b1e0894f2e452d3eab41925ee1d383b14336846c2c7d0e005d65087" gracePeriod=30 Apr 23 13:35:00.130903 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:00.130816 2571 generic.go:358] "Generic (PLEG): container finished" podID="9f54f5af-8da9-41e6-bba0-09abda7835d7" containerID="82bda2a26b1e0894f2e452d3eab41925ee1d383b14336846c2c7d0e005d65087" exitCode=2 Apr 23 13:35:00.130903 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:00.130872 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" event={"ID":"9f54f5af-8da9-41e6-bba0-09abda7835d7","Type":"ContainerDied","Data":"82bda2a26b1e0894f2e452d3eab41925ee1d383b14336846c2c7d0e005d65087"} Apr 23 13:35:00.130903 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:00.130897 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-68b669577-z4bj9" event={"ID":"9f54f5af-8da9-41e6-bba0-09abda7835d7","Type":"ContainerStarted","Data":"0ef386b32383dd68866dd4a107645e297bd6b1286ee61659393b14f092658e6e"} Apr 23 13:35:03.248296 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:03.248267 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nbf64_cecba8e3-e60c-4053-96af-5ab1c4960855/dns-node-resolver/0.log" Apr 23 13:35:17.245901 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:17.245855 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:35:17.248070 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:17.248049 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6866a2aa-1943-4e03-a99a-8b054a2434c8-metrics-certs\") pod \"network-metrics-daemon-k958q\" (UID: \"6866a2aa-1943-4e03-a99a-8b054a2434c8\") " pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:35:17.432711 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:17.432681 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lbjnn\"" Apr 23 13:35:17.440399 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:17.440382 2571 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k958q" Apr 23 13:35:17.553757 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:17.553728 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k958q"] Apr 23 13:35:17.556905 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:35:17.556875 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6866a2aa_1943_4e03_a99a_8b054a2434c8.slice/crio-05e084522ffa93a16f1852d6cfe6db3a40161d8e85d54a1f4b16c50d91559c8a WatchSource:0}: Error finding container 05e084522ffa93a16f1852d6cfe6db3a40161d8e85d54a1f4b16c50d91559c8a: Status 404 returned error can't find the container with id 05e084522ffa93a16f1852d6cfe6db3a40161d8e85d54a1f4b16c50d91559c8a Apr 23 13:35:18.176145 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:18.176103 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k958q" event={"ID":"6866a2aa-1943-4e03-a99a-8b054a2434c8","Type":"ContainerStarted","Data":"05e084522ffa93a16f1852d6cfe6db3a40161d8e85d54a1f4b16c50d91559c8a"} Apr 23 13:35:19.181020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:19.180984 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k958q" event={"ID":"6866a2aa-1943-4e03-a99a-8b054a2434c8","Type":"ContainerStarted","Data":"742d7c66929b333f905b0a58c9f0ea9c8aa64626dd078931c85efe8cb1d52e4b"} Apr 23 13:35:19.181020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:19.181019 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k958q" event={"ID":"6866a2aa-1943-4e03-a99a-8b054a2434c8","Type":"ContainerStarted","Data":"9a78d7cdb43225021510a7eb44683f9645052bf31757169cfda927a9c137ed31"} Apr 23 13:35:19.198376 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:19.198294 2571 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-multus/network-metrics-daemon-k958q" podStartSLOduration=252.315012425 podStartE2EDuration="4m13.19827833s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:35:17.55925176 +0000 UTC m=+251.697530426" lastFinishedPulling="2026-04-23 13:35:18.442517668 +0000 UTC m=+252.580796331" observedRunningTime="2026-04-23 13:35:19.197590187 +0000 UTC m=+253.335868871" watchObservedRunningTime="2026-04-23 13:35:19.19827833 +0000 UTC m=+253.336557016" Apr 23 13:35:44.929935 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:35:44.929888 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vf5fb" podUID="495e17e1-7de5-454d-a2b3-240f3cf879a4" Apr 23 13:35:45.240156 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:45.240128 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vf5fb" Apr 23 13:35:48.259834 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.259799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb" Apr 23 13:35:48.260284 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.259849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m" Apr 23 13:35:48.262148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.262118 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/495e17e1-7de5-454d-a2b3-240f3cf879a4-metrics-tls\") pod \"dns-default-vf5fb\" (UID: \"495e17e1-7de5-454d-a2b3-240f3cf879a4\") " pod="openshift-dns/dns-default-vf5fb" Apr 23 13:35:48.262148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.262129 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6566a883-ca9d-4251-b7f3-7e0e087e3020-cert\") pod \"ingress-canary-wtf4m\" (UID: \"6566a883-ca9d-4251-b7f3-7e0e087e3020\") " pod="openshift-ingress-canary/ingress-canary-wtf4m" Apr 23 13:35:48.533040 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.532953 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pj4z2\"" Apr 23 13:35:48.541260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.541239 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wtf4m" Apr 23 13:35:48.543712 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.543694 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gtr2j\"" Apr 23 13:35:48.552207 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.552183 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vf5fb" Apr 23 13:35:48.687120 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.687048 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wtf4m"] Apr 23 13:35:48.689746 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:35:48.689710 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6566a883_ca9d_4251_b7f3_7e0e087e3020.slice/crio-ff64be886a3be9e4bf7c37fef20015663fd72fbc6c012a4d26c2f0ef4de505e1 WatchSource:0}: Error finding container ff64be886a3be9e4bf7c37fef20015663fd72fbc6c012a4d26c2f0ef4de505e1: Status 404 returned error can't find the container with id ff64be886a3be9e4bf7c37fef20015663fd72fbc6c012a4d26c2f0ef4de505e1 Apr 23 13:35:48.713326 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:48.713295 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vf5fb"] Apr 23 13:35:48.716554 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:35:48.716529 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod495e17e1_7de5_454d_a2b3_240f3cf879a4.slice/crio-084e064dcda370b18e47d1665427d83444ec6e16904de1d1cea134257b26c988 WatchSource:0}: Error finding container 084e064dcda370b18e47d1665427d83444ec6e16904de1d1cea134257b26c988: Status 404 returned error can't find the container with id 084e064dcda370b18e47d1665427d83444ec6e16904de1d1cea134257b26c988 Apr 23 13:35:49.251487 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:49.251433 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wtf4m" event={"ID":"6566a883-ca9d-4251-b7f3-7e0e087e3020","Type":"ContainerStarted","Data":"ff64be886a3be9e4bf7c37fef20015663fd72fbc6c012a4d26c2f0ef4de505e1"} Apr 23 13:35:49.252562 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:49.252527 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-vf5fb" event={"ID":"495e17e1-7de5-454d-a2b3-240f3cf879a4","Type":"ContainerStarted","Data":"084e064dcda370b18e47d1665427d83444ec6e16904de1d1cea134257b26c988"} Apr 23 13:35:51.258830 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:51.258792 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wtf4m" event={"ID":"6566a883-ca9d-4251-b7f3-7e0e087e3020","Type":"ContainerStarted","Data":"00ef00d0ca6c3d6ad70c9edec50e9db347ec62da8684d29478ade45832b93ab2"} Apr 23 13:35:51.260357 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:51.260310 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vf5fb" event={"ID":"495e17e1-7de5-454d-a2b3-240f3cf879a4","Type":"ContainerStarted","Data":"acd0fd31b373357d9a44a319280bdcbffb47b180619539286897fe4f98ea6f3d"} Apr 23 13:35:51.260480 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:51.260362 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vf5fb" event={"ID":"495e17e1-7de5-454d-a2b3-240f3cf879a4","Type":"ContainerStarted","Data":"cc1a7716374b78ae12e2eb0ae964a20c954a81c1f9f054a0d8d7721942327e52"} Apr 23 13:35:51.260480 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:51.260440 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vf5fb" Apr 23 13:35:51.274180 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:51.274140 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wtf4m" podStartSLOduration=251.55014637 podStartE2EDuration="4m13.274127643s" podCreationTimestamp="2026-04-23 13:31:38 +0000 UTC" firstStartedPulling="2026-04-23 13:35:48.691779846 +0000 UTC m=+282.830058510" lastFinishedPulling="2026-04-23 13:35:50.415761119 +0000 UTC m=+284.554039783" observedRunningTime="2026-04-23 13:35:51.27291834 +0000 UTC m=+285.411197025" 
watchObservedRunningTime="2026-04-23 13:35:51.274127643 +0000 UTC m=+285.412406365" Apr 23 13:35:51.289924 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:35:51.289882 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vf5fb" podStartSLOduration=251.595757591 podStartE2EDuration="4m13.289868304s" podCreationTimestamp="2026-04-23 13:31:38 +0000 UTC" firstStartedPulling="2026-04-23 13:35:48.718288222 +0000 UTC m=+282.856566885" lastFinishedPulling="2026-04-23 13:35:50.41239893 +0000 UTC m=+284.550677598" observedRunningTime="2026-04-23 13:35:51.288886324 +0000 UTC m=+285.427165010" watchObservedRunningTime="2026-04-23 13:35:51.289868304 +0000 UTC m=+285.428146987" Apr 23 13:36:01.265396 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:36:01.265367 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vf5fb" Apr 23 13:36:06.339370 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:36:06.339344 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 13:36:06.339370 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:36:06.339365 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 13:36:06.344192 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:36:06.344172 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 13:37:45.246973 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.246936 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"] Apr 23 13:37:45.250200 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.250183 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj" Apr 23 13:37:45.252821 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.252802 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 13:37:45.253969 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.253953 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bxpk4\"" Apr 23 13:37:45.254030 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.253995 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 13:37:45.258899 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.258875 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"] Apr 23 13:37:45.325327 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.325296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ptsf\" (UniqueName: \"kubernetes.io/projected/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-kube-api-access-9ptsf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj" Apr 23 13:37:45.325499 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.325347 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.325499 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.325371 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.426241 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.426197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.426379 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.426258 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.426379 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.426347 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ptsf\" (UniqueName: \"kubernetes.io/projected/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-kube-api-access-9ptsf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.426697 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.426672 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.426735 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.426682 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.435181 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.435153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ptsf\" (UniqueName: \"kubernetes.io/projected/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-kube-api-access-9ptsf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.560075 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.559998 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:37:45.676018 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.675990 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"]
Apr 23 13:37:45.679059 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:37:45.679033 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37ec7919_ce98_4c8b_b1aa_8ada2efb7563.slice/crio-5c433870e96cdeb87df1cf652e0a7c070e0bdbfe6c474255bed7df164284f895 WatchSource:0}: Error finding container 5c433870e96cdeb87df1cf652e0a7c070e0bdbfe6c474255bed7df164284f895: Status 404 returned error can't find the container with id 5c433870e96cdeb87df1cf652e0a7c070e0bdbfe6c474255bed7df164284f895
Apr 23 13:37:45.680805 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:45.680788 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:37:46.542343 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:46.542285 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj" event={"ID":"37ec7919-ce98-4c8b-b1aa-8ada2efb7563","Type":"ContainerStarted","Data":"5c433870e96cdeb87df1cf652e0a7c070e0bdbfe6c474255bed7df164284f895"}
Apr 23 13:37:50.555743 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:50.555712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj" event={"ID":"37ec7919-ce98-4c8b-b1aa-8ada2efb7563","Type":"ContainerStarted","Data":"74a1f657a4748d0dbe34e9f14cfedceae04c2d7d002551191bb87b27be00b812"}
Apr 23 13:37:51.559054 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:51.559019 2571 generic.go:358] "Generic (PLEG): container finished" podID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerID="74a1f657a4748d0dbe34e9f14cfedceae04c2d7d002551191bb87b27be00b812" exitCode=0
Apr 23 13:37:51.559508 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:51.559110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj" event={"ID":"37ec7919-ce98-4c8b-b1aa-8ada2efb7563","Type":"ContainerDied","Data":"74a1f657a4748d0dbe34e9f14cfedceae04c2d7d002551191bb87b27be00b812"}
Apr 23 13:37:53.566812 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:53.566780 2571 generic.go:358] "Generic (PLEG): container finished" podID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerID="a54c94556e1144e7328f31f22079f8f4db0524e57933adbb93c850780509ce10" exitCode=0
Apr 23 13:37:53.567116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:53.566832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj" event={"ID":"37ec7919-ce98-4c8b-b1aa-8ada2efb7563","Type":"ContainerDied","Data":"a54c94556e1144e7328f31f22079f8f4db0524e57933adbb93c850780509ce10"}
Apr 23 13:37:59.585100 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:59.585068 2571 generic.go:358] "Generic (PLEG): container finished" podID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerID="0c3e9d90be526e5fe82f7cf59d1039b1c20a91f8ab768602b229f068a2f67e5e" exitCode=0
Apr 23 13:37:59.585490 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:37:59.585141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj" event={"ID":"37ec7919-ce98-4c8b-b1aa-8ada2efb7563","Type":"ContainerDied","Data":"0c3e9d90be526e5fe82f7cf59d1039b1c20a91f8ab768602b229f068a2f67e5e"}
Apr 23 13:38:00.700198 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.700178 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:38:00.745598 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.745570 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-bundle\") pod \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") "
Apr 23 13:38:00.745727 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.745612 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-util\") pod \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") "
Apr 23 13:38:00.745727 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.745646 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ptsf\" (UniqueName: \"kubernetes.io/projected/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-kube-api-access-9ptsf\") pod \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\" (UID: \"37ec7919-ce98-4c8b-b1aa-8ada2efb7563\") "
Apr 23 13:38:00.746144 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.746110 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-bundle" (OuterVolumeSpecName: "bundle") pod "37ec7919-ce98-4c8b-b1aa-8ada2efb7563" (UID: "37ec7919-ce98-4c8b-b1aa-8ada2efb7563"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:38:00.747791 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.747768 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-kube-api-access-9ptsf" (OuterVolumeSpecName: "kube-api-access-9ptsf") pod "37ec7919-ce98-4c8b-b1aa-8ada2efb7563" (UID: "37ec7919-ce98-4c8b-b1aa-8ada2efb7563"). InnerVolumeSpecName "kube-api-access-9ptsf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:38:00.749966 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.749936 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-util" (OuterVolumeSpecName: "util") pod "37ec7919-ce98-4c8b-b1aa-8ada2efb7563" (UID: "37ec7919-ce98-4c8b-b1aa-8ada2efb7563"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:38:00.846571 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.846471 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:38:00.846571 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.846514 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:38:00.846571 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:00.846525 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9ptsf\" (UniqueName: \"kubernetes.io/projected/37ec7919-ce98-4c8b-b1aa-8ada2efb7563-kube-api-access-9ptsf\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:38:01.591129 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:01.591093 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj" event={"ID":"37ec7919-ce98-4c8b-b1aa-8ada2efb7563","Type":"ContainerDied","Data":"5c433870e96cdeb87df1cf652e0a7c070e0bdbfe6c474255bed7df164284f895"}
Apr 23 13:38:01.591129 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:01.591130 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c433870e96cdeb87df1cf652e0a7c070e0bdbfe6c474255bed7df164284f895"
Apr 23 13:38:01.591356 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:01.591112 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c6kbqj"
Apr 23 13:38:12.686082 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.686050 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-db79b"]
Apr 23 13:38:12.686495 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.686285 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerName="pull"
Apr 23 13:38:12.686495 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.686295 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerName="pull"
Apr 23 13:38:12.686495 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.686303 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerName="extract"
Apr 23 13:38:12.686495 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.686308 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerName="extract"
Apr 23 13:38:12.686495 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.686316 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerName="util"
Apr 23 13:38:12.686495 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.686321 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerName="util"
Apr 23 13:38:12.686495 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.686378 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="37ec7919-ce98-4c8b-b1aa-8ada2efb7563" containerName="extract"
Apr 23 13:38:12.693082 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.693065 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.695898 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.695875 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 23 13:38:12.695898 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.695887 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 23 13:38:12.696078 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.695900 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-l4gv6\""
Apr 23 13:38:12.696078 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.695881 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 23 13:38:12.696856 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.696841 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 23 13:38:12.696933 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.696856 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 23 13:38:12.701681 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.701661 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-db79b"]
Apr 23 13:38:12.726029 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.726004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.726129 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.726043 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7f608843-ca86-4399-b6a4-85d583e0315f-cabundle0\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.726195 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.726126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmjvc\" (UniqueName: \"kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-kube-api-access-nmjvc\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.827012 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.826986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmjvc\" (UniqueName: \"kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-kube-api-access-nmjvc\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.827137 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.827029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.827137 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.827077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7f608843-ca86-4399-b6a4-85d583e0315f-cabundle0\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.827320 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:12.827300 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:38:12.827396 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:12.827349 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:38:12.827396 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:12.827364 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-db79b: references non-existent secret key: ca.crt
Apr 23 13:38:12.827462 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:12.827436 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates podName:7f608843-ca86-4399-b6a4-85d583e0315f nodeName:}" failed. No retries permitted until 2026-04-23 13:38:13.327415958 +0000 UTC m=+427.465694634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates") pod "keda-operator-ffbb595cb-db79b" (UID: "7f608843-ca86-4399-b6a4-85d583e0315f") : references non-existent secret key: ca.crt
Apr 23 13:38:12.827761 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.827744 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/7f608843-ca86-4399-b6a4-85d583e0315f-cabundle0\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.855174 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.855153 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmjvc\" (UniqueName: \"kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-kube-api-access-nmjvc\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:12.989507 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.989477 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"]
Apr 23 13:38:12.992704 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.992684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:12.995084 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:12.995058 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 23 13:38:13.001163 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.001132 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"]
Apr 23 13:38:13.029062 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.029036 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.029257 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.029076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trtcx\" (UniqueName: \"kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-kube-api-access-trtcx\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.029257 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.029096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.129717 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.129691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trtcx\" (UniqueName: \"kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-kube-api-access-trtcx\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.129892 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.129721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.129892 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.129767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.129892 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:13.129886 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 23 13:38:13.129994 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:13.129902 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 13:38:13.129994 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:13.129918 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq: references non-existent secret key: tls.crt
Apr 23 13:38:13.129994 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:13.129967 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-certificates podName:1dfad772-5fbf-4c0b-a2ca-8c4268065e53 nodeName:}" failed. No retries permitted until 2026-04-23 13:38:13.629953979 +0000 UTC m=+427.768232650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-certificates") pod "keda-metrics-apiserver-7c9f485588-qvldq" (UID: "1dfad772-5fbf-4c0b-a2ca-8c4268065e53") : references non-existent secret key: tls.crt
Apr 23 13:38:13.130089 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.130071 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.138716 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.138700 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trtcx\" (UniqueName: \"kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-kube-api-access-trtcx\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.331952 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.331874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:13.332104 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:13.332032 2571 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:38:13.332104 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:13.332052 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:38:13.332104 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:13.332064 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-db79b: references non-existent secret key: ca.crt
Apr 23 13:38:13.332263 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:38:13.332157 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates podName:7f608843-ca86-4399-b6a4-85d583e0315f nodeName:}" failed. No retries permitted until 2026-04-23 13:38:14.332136458 +0000 UTC m=+428.470415136 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates") pod "keda-operator-ffbb595cb-db79b" (UID: "7f608843-ca86-4399-b6a4-85d583e0315f") : references non-existent secret key: ca.crt
Apr 23 13:38:13.634504 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.634425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.636812 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.636793 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1dfad772-5fbf-4c0b-a2ca-8c4268065e53-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qvldq\" (UID: \"1dfad772-5fbf-4c0b-a2ca-8c4268065e53\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:13.905168 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:13.905080 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:14.022941 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:14.022918 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"]
Apr 23 13:38:14.025514 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:38:14.025488 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dfad772_5fbf_4c0b_a2ca_8c4268065e53.slice/crio-be1291b9fa8cc6be69fe8ef7435e783a761e4674bbb32d25bc9085f85ec34144 WatchSource:0}: Error finding container be1291b9fa8cc6be69fe8ef7435e783a761e4674bbb32d25bc9085f85ec34144: Status 404 returned error can't find the container with id be1291b9fa8cc6be69fe8ef7435e783a761e4674bbb32d25bc9085f85ec34144
Apr 23 13:38:14.339215 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:14.339179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:14.341655 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:14.341631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7f608843-ca86-4399-b6a4-85d583e0315f-certificates\") pod \"keda-operator-ffbb595cb-db79b\" (UID: \"7f608843-ca86-4399-b6a4-85d583e0315f\") " pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:14.503658 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:14.503614 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:14.615418 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:14.615259 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-db79b"]
Apr 23 13:38:14.617678 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:38:14.617652 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f608843_ca86_4399_b6a4_85d583e0315f.slice/crio-2d990215536a6d107c2cd4740461ca6ecd882f546749bfd672723c265f3391bf WatchSource:0}: Error finding container 2d990215536a6d107c2cd4740461ca6ecd882f546749bfd672723c265f3391bf: Status 404 returned error can't find the container with id 2d990215536a6d107c2cd4740461ca6ecd882f546749bfd672723c265f3391bf
Apr 23 13:38:14.627287 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:14.627241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-db79b" event={"ID":"7f608843-ca86-4399-b6a4-85d583e0315f","Type":"ContainerStarted","Data":"2d990215536a6d107c2cd4740461ca6ecd882f546749bfd672723c265f3391bf"}
Apr 23 13:38:14.628264 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:14.628235 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq" event={"ID":"1dfad772-5fbf-4c0b-a2ca-8c4268065e53","Type":"ContainerStarted","Data":"be1291b9fa8cc6be69fe8ef7435e783a761e4674bbb32d25bc9085f85ec34144"}
Apr 23 13:38:17.637911 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:17.637878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq" event={"ID":"1dfad772-5fbf-4c0b-a2ca-8c4268065e53","Type":"ContainerStarted","Data":"07b8829e98be3fafd8e24922806f72fba5677e68f8a8944b8414a9623fe485ac"}
Apr 23 13:38:17.638282 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:17.638030 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:17.654365 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:17.654293 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq" podStartSLOduration=2.581617085 podStartE2EDuration="5.654280268s" podCreationTimestamp="2026-04-23 13:38:12 +0000 UTC" firstStartedPulling="2026-04-23 13:38:14.027100876 +0000 UTC m=+428.165379538" lastFinishedPulling="2026-04-23 13:38:17.099764055 +0000 UTC m=+431.238042721" observedRunningTime="2026-04-23 13:38:17.653132548 +0000 UTC m=+431.791411247" watchObservedRunningTime="2026-04-23 13:38:17.654280268 +0000 UTC m=+431.792558954"
Apr 23 13:38:19.645467 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:19.645363 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-db79b" event={"ID":"7f608843-ca86-4399-b6a4-85d583e0315f","Type":"ContainerStarted","Data":"406e392a1c2ed094e887f27bd14ef5c8f1bfa64be1dee61e2c28b9d55f93f6cf"}
Apr 23 13:38:19.645904 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:19.645518 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:38:19.673636 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:19.673584 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-db79b" podStartSLOduration=3.035987656 podStartE2EDuration="7.673566864s" podCreationTimestamp="2026-04-23 13:38:12 +0000 UTC" firstStartedPulling="2026-04-23 13:38:14.618964261 +0000 UTC m=+428.757242924" lastFinishedPulling="2026-04-23 13:38:19.256543468 +0000 UTC m=+433.394822132" observedRunningTime="2026-04-23 13:38:19.671990568 +0000 UTC m=+433.810269254" watchObservedRunningTime="2026-04-23 13:38:19.673566864 +0000 UTC m=+433.811845549"
Apr 23 13:38:28.645876 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:28.645798 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qvldq"
Apr 23 13:38:40.650185 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:38:40.650154 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-db79b"
Apr 23 13:39:06.156820 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.156787 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"]
Apr 23 13:39:06.159984 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.159966 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.162939 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.162915 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 13:39:06.163186 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.163171 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 13:39:06.163853 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.163839 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bxpk4\""
Apr 23 13:39:06.168737 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.168717 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"]
Apr 23 13:39:06.284884 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.284850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpx92\" (UniqueName: \"kubernetes.io/projected/1d598f13-e7f6-43a4-a627-11991bfc07c5-kube-api-access-bpx92\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.285034 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.284900 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.285034 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.284966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.385656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.385621 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.385827 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.385661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.385827 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.385713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpx92\" (UniqueName: \"kubernetes.io/projected/1d598f13-e7f6-43a4-a627-11991bfc07c5-kube-api-access-bpx92\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.386093 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.386066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.386153 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.386081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"
Apr 23 13:39:06.396140 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.396118 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 13:39:06.409605 ip-10-0-128-108 kubenswrapper[2571]: I0423
13:39:06.409556 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 13:39:06.417245 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.417226 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpx92\" (UniqueName: \"kubernetes.io/projected/1d598f13-e7f6-43a4-a627-11991bfc07c5-kube-api-access-bpx92\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" Apr 23 13:39:06.471927 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.471903 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bxpk4\"" Apr 23 13:39:06.479275 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.479260 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" Apr 23 13:39:06.609233 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.609191 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm"] Apr 23 13:39:06.612135 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:39:06.612108 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d598f13_e7f6_43a4_a627_11991bfc07c5.slice/crio-9f7ed0f8aeeb37d2eb3b44c58269684f194a33793d5fa43d444f2f8765807833 WatchSource:0}: Error finding container 9f7ed0f8aeeb37d2eb3b44c58269684f194a33793d5fa43d444f2f8765807833: Status 404 returned error can't find the container with id 9f7ed0f8aeeb37d2eb3b44c58269684f194a33793d5fa43d444f2f8765807833 Apr 23 13:39:06.764584 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.764551 2571 
generic.go:358] "Generic (PLEG): container finished" podID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerID="f9b0d7397356938e83b8ee240315b2eecfde1a21bcd0fd0a98611780de058691" exitCode=0 Apr 23 13:39:06.764685 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.764634 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" event={"ID":"1d598f13-e7f6-43a4-a627-11991bfc07c5","Type":"ContainerDied","Data":"f9b0d7397356938e83b8ee240315b2eecfde1a21bcd0fd0a98611780de058691"} Apr 23 13:39:06.764685 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:06.764667 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" event={"ID":"1d598f13-e7f6-43a4-a627-11991bfc07c5","Type":"ContainerStarted","Data":"9f7ed0f8aeeb37d2eb3b44c58269684f194a33793d5fa43d444f2f8765807833"} Apr 23 13:39:08.772110 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:08.772072 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" event={"ID":"1d598f13-e7f6-43a4-a627-11991bfc07c5","Type":"ContainerStarted","Data":"5418cd40fa43589e1d1c3dbdc7f8ab841cd4fbd192f2f9f3b3d80e2783612a2f"} Apr 23 13:39:09.776929 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:09.776898 2571 generic.go:358] "Generic (PLEG): container finished" podID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerID="5418cd40fa43589e1d1c3dbdc7f8ab841cd4fbd192f2f9f3b3d80e2783612a2f" exitCode=0 Apr 23 13:39:09.777515 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:09.776988 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" event={"ID":"1d598f13-e7f6-43a4-a627-11991bfc07c5","Type":"ContainerDied","Data":"5418cd40fa43589e1d1c3dbdc7f8ab841cd4fbd192f2f9f3b3d80e2783612a2f"} Apr 23 
13:39:10.781104 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:10.781065 2571 generic.go:358] "Generic (PLEG): container finished" podID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerID="da29ba83f844c4d91b94a6713245cb135c4e269ec83114c1b738b8eb56d8fae7" exitCode=0 Apr 23 13:39:10.781476 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:10.781118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" event={"ID":"1d598f13-e7f6-43a4-a627-11991bfc07c5","Type":"ContainerDied","Data":"da29ba83f844c4d91b94a6713245cb135c4e269ec83114c1b738b8eb56d8fae7"} Apr 23 13:39:11.900088 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:11.900064 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" Apr 23 13:39:12.030636 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.030606 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-bundle\") pod \"1d598f13-e7f6-43a4-a627-11991bfc07c5\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " Apr 23 13:39:12.030636 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.030644 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpx92\" (UniqueName: \"kubernetes.io/projected/1d598f13-e7f6-43a4-a627-11991bfc07c5-kube-api-access-bpx92\") pod \"1d598f13-e7f6-43a4-a627-11991bfc07c5\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " Apr 23 13:39:12.030823 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.030693 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-util\") pod \"1d598f13-e7f6-43a4-a627-11991bfc07c5\" (UID: \"1d598f13-e7f6-43a4-a627-11991bfc07c5\") " 
Apr 23 13:39:12.031399 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.031370 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-bundle" (OuterVolumeSpecName: "bundle") pod "1d598f13-e7f6-43a4-a627-11991bfc07c5" (UID: "1d598f13-e7f6-43a4-a627-11991bfc07c5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:39:12.032845 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.032827 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d598f13-e7f6-43a4-a627-11991bfc07c5-kube-api-access-bpx92" (OuterVolumeSpecName: "kube-api-access-bpx92") pod "1d598f13-e7f6-43a4-a627-11991bfc07c5" (UID: "1d598f13-e7f6-43a4-a627-11991bfc07c5"). InnerVolumeSpecName "kube-api-access-bpx92". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:39:12.035114 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.035069 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-util" (OuterVolumeSpecName: "util") pod "1d598f13-e7f6-43a4-a627-11991bfc07c5" (UID: "1d598f13-e7f6-43a4-a627-11991bfc07c5"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:39:12.131876 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.131847 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:39:12.131876 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.131871 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d598f13-e7f6-43a4-a627-11991bfc07c5-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:39:12.131876 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.131881 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpx92\" (UniqueName: \"kubernetes.io/projected/1d598f13-e7f6-43a4-a627-11991bfc07c5-kube-api-access-bpx92\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:39:12.788456 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.788416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" event={"ID":"1d598f13-e7f6-43a4-a627-11991bfc07c5","Type":"ContainerDied","Data":"9f7ed0f8aeeb37d2eb3b44c58269684f194a33793d5fa43d444f2f8765807833"} Apr 23 13:39:12.788456 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.788461 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f7ed0f8aeeb37d2eb3b44c58269684f194a33793d5fa43d444f2f8765807833" Apr 23 13:39:12.788652 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:12.788492 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d55kfm" Apr 23 13:39:19.437094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.437057 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m"] Apr 23 13:39:19.437575 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.437394 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerName="pull" Apr 23 13:39:19.437575 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.437411 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerName="pull" Apr 23 13:39:19.437575 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.437428 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerName="util" Apr 23 13:39:19.437575 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.437437 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerName="util" Apr 23 13:39:19.437575 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.437455 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerName="extract" Apr 23 13:39:19.437575 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.437464 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerName="extract" Apr 23 13:39:19.437575 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.437525 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d598f13-e7f6-43a4-a627-11991bfc07c5" containerName="extract" Apr 23 13:39:19.442748 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.442727 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" Apr 23 13:39:19.445798 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.445777 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-v6lhg\"" Apr 23 13:39:19.446425 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.446407 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:39:19.446425 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.446417 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 23 13:39:19.462685 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.462661 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m"] Apr 23 13:39:19.481920 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.481893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dkbh\" (UniqueName: \"kubernetes.io/projected/f7106efb-14ad-4629-a445-2c93327c4e51-kube-api-access-8dkbh\") pod \"cert-manager-operator-controller-manager-54b9655956-p6z5m\" (UID: \"f7106efb-14ad-4629-a445-2c93327c4e51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" Apr 23 13:39:19.482013 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.481959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7106efb-14ad-4629-a445-2c93327c4e51-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-p6z5m\" (UID: \"f7106efb-14ad-4629-a445-2c93327c4e51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" 
Apr 23 13:39:19.582228 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.582197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7106efb-14ad-4629-a445-2c93327c4e51-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-p6z5m\" (UID: \"f7106efb-14ad-4629-a445-2c93327c4e51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" Apr 23 13:39:19.582363 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.582256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dkbh\" (UniqueName: \"kubernetes.io/projected/f7106efb-14ad-4629-a445-2c93327c4e51-kube-api-access-8dkbh\") pod \"cert-manager-operator-controller-manager-54b9655956-p6z5m\" (UID: \"f7106efb-14ad-4629-a445-2c93327c4e51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" Apr 23 13:39:19.582619 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.582599 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7106efb-14ad-4629-a445-2c93327c4e51-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-p6z5m\" (UID: \"f7106efb-14ad-4629-a445-2c93327c4e51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" Apr 23 13:39:19.601542 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.601519 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dkbh\" (UniqueName: \"kubernetes.io/projected/f7106efb-14ad-4629-a445-2c93327c4e51-kube-api-access-8dkbh\") pod \"cert-manager-operator-controller-manager-54b9655956-p6z5m\" (UID: \"f7106efb-14ad-4629-a445-2c93327c4e51\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" Apr 23 13:39:19.751323 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.751288 2571 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" Apr 23 13:39:19.896146 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:19.896123 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m"] Apr 23 13:39:19.898490 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:39:19.898462 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7106efb_14ad_4629_a445_2c93327c4e51.slice/crio-88296626a6bad491338b4129282972f085b59a1e4f68c98f31474f7b4357ed2c WatchSource:0}: Error finding container 88296626a6bad491338b4129282972f085b59a1e4f68c98f31474f7b4357ed2c: Status 404 returned error can't find the container with id 88296626a6bad491338b4129282972f085b59a1e4f68c98f31474f7b4357ed2c Apr 23 13:39:20.812916 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:20.812863 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" event={"ID":"f7106efb-14ad-4629-a445-2c93327c4e51","Type":"ContainerStarted","Data":"88296626a6bad491338b4129282972f085b59a1e4f68c98f31474f7b4357ed2c"} Apr 23 13:39:21.817743 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:21.817715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" event={"ID":"f7106efb-14ad-4629-a445-2c93327c4e51","Type":"ContainerStarted","Data":"1405d529eb13d8631e653cfe17c9c6a5e34729a26bd8e04b5fb80c43d703960a"} Apr 23 13:39:27.569525 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.569466 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-p6z5m" podStartSLOduration=6.808513233 podStartE2EDuration="8.569447318s" podCreationTimestamp="2026-04-23 
13:39:19 +0000 UTC" firstStartedPulling="2026-04-23 13:39:19.901636739 +0000 UTC m=+494.039915406" lastFinishedPulling="2026-04-23 13:39:21.662570828 +0000 UTC m=+495.800849491" observedRunningTime="2026-04-23 13:39:21.856566085 +0000 UTC m=+495.994844770" watchObservedRunningTime="2026-04-23 13:39:27.569447318 +0000 UTC m=+501.707726001" Apr 23 13:39:27.570663 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.570644 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc"] Apr 23 13:39:27.574032 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.574009 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.576779 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.576760 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 13:39:27.577627 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.577609 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 13:39:27.577703 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.577617 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bxpk4\"" Apr 23 13:39:27.582613 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.582591 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc"] Apr 23 13:39:27.635579 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.635554 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8h6p\" (UniqueName: 
\"kubernetes.io/projected/8462f8fd-723d-4097-9099-480ee2b21254-kube-api-access-j8h6p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.635684 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.635587 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.635684 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.635633 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.735943 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.735910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.736096 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.735976 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8h6p\" (UniqueName: 
\"kubernetes.io/projected/8462f8fd-723d-4097-9099-480ee2b21254-kube-api-access-j8h6p\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.736096 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.736010 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.736287 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.736267 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.736407 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.736391 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.745849 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.745817 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8h6p\" (UniqueName: \"kubernetes.io/projected/8462f8fd-723d-4097-9099-480ee2b21254-kube-api-access-j8h6p\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.883195 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.883110 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" Apr 23 13:39:27.999249 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:27.999082 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc"] Apr 23 13:39:28.001720 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:39:28.001694 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8462f8fd_723d_4097_9099_480ee2b21254.slice/crio-fb46450d81f448bc84552a0c3dc7c5a483c61f15841795b98d119bc7aeffce5d WatchSource:0}: Error finding container fb46450d81f448bc84552a0c3dc7c5a483c61f15841795b98d119bc7aeffce5d: Status 404 returned error can't find the container with id fb46450d81f448bc84552a0c3dc7c5a483c61f15841795b98d119bc7aeffce5d Apr 23 13:39:28.843673 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:28.843643 2571 generic.go:358] "Generic (PLEG): container finished" podID="8462f8fd-723d-4097-9099-480ee2b21254" containerID="f1b2107360a56bb11b788915b4b4534a9db16a441a17daec85998b4eb6f6ba8e" exitCode=0 Apr 23 13:39:28.843989 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:28.843679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" event={"ID":"8462f8fd-723d-4097-9099-480ee2b21254","Type":"ContainerDied","Data":"f1b2107360a56bb11b788915b4b4534a9db16a441a17daec85998b4eb6f6ba8e"} Apr 23 13:39:28.843989 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:28.843701 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" event={"ID":"8462f8fd-723d-4097-9099-480ee2b21254","Type":"ContainerStarted","Data":"fb46450d81f448bc84552a0c3dc7c5a483c61f15841795b98d119bc7aeffce5d"}
Apr 23 13:39:30.851744 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:30.851657 2571 generic.go:358] "Generic (PLEG): container finished" podID="8462f8fd-723d-4097-9099-480ee2b21254" containerID="a3783b819cde8682ba8f74a53b073ed8db91694de8cbbff5b8221940e6591819" exitCode=0
Apr 23 13:39:30.851744 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:30.851722 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" event={"ID":"8462f8fd-723d-4097-9099-480ee2b21254","Type":"ContainerDied","Data":"a3783b819cde8682ba8f74a53b073ed8db91694de8cbbff5b8221940e6591819"}
Apr 23 13:39:31.857161 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:31.857129 2571 generic.go:358] "Generic (PLEG): container finished" podID="8462f8fd-723d-4097-9099-480ee2b21254" containerID="a3ec12e8cc2ae184fa151a9ef1fd5b00738d7e667fa24e5b58409c9e3c6277b3" exitCode=0
Apr 23 13:39:31.857559 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:31.857196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" event={"ID":"8462f8fd-723d-4097-9099-480ee2b21254","Type":"ContainerDied","Data":"a3ec12e8cc2ae184fa151a9ef1fd5b00738d7e667fa24e5b58409c9e3c6277b3"}
Apr 23 13:39:32.981183 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:32.981153 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc"
Apr 23 13:39:33.072037 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.072007 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-bundle\") pod \"8462f8fd-723d-4097-9099-480ee2b21254\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") "
Apr 23 13:39:33.072187 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.072046 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-util\") pod \"8462f8fd-723d-4097-9099-480ee2b21254\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") "
Apr 23 13:39:33.072187 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.072089 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8h6p\" (UniqueName: \"kubernetes.io/projected/8462f8fd-723d-4097-9099-480ee2b21254-kube-api-access-j8h6p\") pod \"8462f8fd-723d-4097-9099-480ee2b21254\" (UID: \"8462f8fd-723d-4097-9099-480ee2b21254\") "
Apr 23 13:39:33.072418 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.072394 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-bundle" (OuterVolumeSpecName: "bundle") pod "8462f8fd-723d-4097-9099-480ee2b21254" (UID: "8462f8fd-723d-4097-9099-480ee2b21254"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:39:33.074166 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.074139 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8462f8fd-723d-4097-9099-480ee2b21254-kube-api-access-j8h6p" (OuterVolumeSpecName: "kube-api-access-j8h6p") pod "8462f8fd-723d-4097-9099-480ee2b21254" (UID: "8462f8fd-723d-4097-9099-480ee2b21254"). InnerVolumeSpecName "kube-api-access-j8h6p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:39:33.077437 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.077418 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-util" (OuterVolumeSpecName: "util") pod "8462f8fd-723d-4097-9099-480ee2b21254" (UID: "8462f8fd-723d-4097-9099-480ee2b21254"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:39:33.173049 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.172985 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8h6p\" (UniqueName: \"kubernetes.io/projected/8462f8fd-723d-4097-9099-480ee2b21254-kube-api-access-j8h6p\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:39:33.173049 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.173009 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:39:33.173049 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.173019 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8462f8fd-723d-4097-9099-480ee2b21254-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:39:33.865062 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.865020 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc" event={"ID":"8462f8fd-723d-4097-9099-480ee2b21254","Type":"ContainerDied","Data":"fb46450d81f448bc84552a0c3dc7c5a483c61f15841795b98d119bc7aeffce5d"}
Apr 23 13:39:33.865062 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.865065 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb46450d81f448bc84552a0c3dc7c5a483c61f15841795b98d119bc7aeffce5d"
Apr 23 13:39:33.865260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:33.865067 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f5r4wc"
Apr 23 13:39:49.389125 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.389091 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"]
Apr 23 13:39:49.389547 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.389321 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8462f8fd-723d-4097-9099-480ee2b21254" containerName="pull"
Apr 23 13:39:49.389547 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.389345 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8462f8fd-723d-4097-9099-480ee2b21254" containerName="pull"
Apr 23 13:39:49.389547 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.389359 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8462f8fd-723d-4097-9099-480ee2b21254" containerName="util"
Apr 23 13:39:49.389547 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.389366 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8462f8fd-723d-4097-9099-480ee2b21254" containerName="util"
Apr 23 13:39:49.389547 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.389388 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8462f8fd-723d-4097-9099-480ee2b21254" containerName="extract"
Apr 23 13:39:49.389547 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.389393 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8462f8fd-723d-4097-9099-480ee2b21254" containerName="extract"
Apr 23 13:39:49.389547 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.389437 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8462f8fd-723d-4097-9099-480ee2b21254" containerName="extract"
Apr 23 13:39:49.396723 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.396701 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.399698 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.399671 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 13:39:49.399823 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.399735 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bxpk4\""
Apr 23 13:39:49.399823 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.399705 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 13:39:49.401926 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.401907 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"]
Apr 23 13:39:49.479134 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.479102 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.479288 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.479143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.479288 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.479217 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsr28\" (UniqueName: \"kubernetes.io/projected/100e999c-2212-49eb-bce8-dc2fad08c93e-kube-api-access-dsr28\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.580034 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.580003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsr28\" (UniqueName: \"kubernetes.io/projected/100e999c-2212-49eb-bce8-dc2fad08c93e-kube-api-access-dsr28\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.580175 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.580047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.580175 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.580068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.580386 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.580366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.580429 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.580399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.588923 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.588898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsr28\" (UniqueName: \"kubernetes.io/projected/100e999c-2212-49eb-bce8-dc2fad08c93e-kube-api-access-dsr28\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.705779 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.705753 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:49.824977 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.824956 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"]
Apr 23 13:39:49.827473 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:39:49.827447 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod100e999c_2212_49eb_bce8_dc2fad08c93e.slice/crio-57e03c6dee7cec81993109387dec0e8a01170d0c00b796951bbd4a37dfdb071a WatchSource:0}: Error finding container 57e03c6dee7cec81993109387dec0e8a01170d0c00b796951bbd4a37dfdb071a: Status 404 returned error can't find the container with id 57e03c6dee7cec81993109387dec0e8a01170d0c00b796951bbd4a37dfdb071a
Apr 23 13:39:49.913413 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.913374 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn" event={"ID":"100e999c-2212-49eb-bce8-dc2fad08c93e","Type":"ContainerStarted","Data":"ac9eff1a2d3c2d554298ffefdd9f1d115af3c4ba48b3609421f0d325209c62b0"}
Apr 23 13:39:49.913545 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:49.913422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn" event={"ID":"100e999c-2212-49eb-bce8-dc2fad08c93e","Type":"ContainerStarted","Data":"57e03c6dee7cec81993109387dec0e8a01170d0c00b796951bbd4a37dfdb071a"}
Apr 23 13:39:50.917265 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:50.917170 2571 generic.go:358] "Generic (PLEG): container finished" podID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerID="ac9eff1a2d3c2d554298ffefdd9f1d115af3c4ba48b3609421f0d325209c62b0" exitCode=0
Apr 23 13:39:50.917631 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:50.917253 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn" event={"ID":"100e999c-2212-49eb-bce8-dc2fad08c93e","Type":"ContainerDied","Data":"ac9eff1a2d3c2d554298ffefdd9f1d115af3c4ba48b3609421f0d325209c62b0"}
Apr 23 13:39:52.925099 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:52.925066 2571 generic.go:358] "Generic (PLEG): container finished" podID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerID="90eb66e9d46cf059f60290d6938623c7e13b6b3d1daacf893e1702cb08caebbb" exitCode=0
Apr 23 13:39:52.925572 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:52.925139 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn" event={"ID":"100e999c-2212-49eb-bce8-dc2fad08c93e","Type":"ContainerDied","Data":"90eb66e9d46cf059f60290d6938623c7e13b6b3d1daacf893e1702cb08caebbb"}
Apr 23 13:39:53.929265 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:53.929227 2571 generic.go:358] "Generic (PLEG): container finished" podID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerID="25d6d6c4804a02867f3991e41e533b25bdf1b5bafe827638542ab25fd97378dc" exitCode=0
Apr 23 13:39:53.929713 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:53.929301 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn" event={"ID":"100e999c-2212-49eb-bce8-dc2fad08c93e","Type":"ContainerDied","Data":"25d6d6c4804a02867f3991e41e533b25bdf1b5bafe827638542ab25fd97378dc"}
Apr 23 13:39:55.049232 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.049208 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:55.118883 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.118852 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsr28\" (UniqueName: \"kubernetes.io/projected/100e999c-2212-49eb-bce8-dc2fad08c93e-kube-api-access-dsr28\") pod \"100e999c-2212-49eb-bce8-dc2fad08c93e\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") "
Apr 23 13:39:55.119042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.118891 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-util\") pod \"100e999c-2212-49eb-bce8-dc2fad08c93e\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") "
Apr 23 13:39:55.119042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.118910 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-bundle\") pod \"100e999c-2212-49eb-bce8-dc2fad08c93e\" (UID: \"100e999c-2212-49eb-bce8-dc2fad08c93e\") "
Apr 23 13:39:55.119746 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.119720 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-bundle" (OuterVolumeSpecName: "bundle") pod "100e999c-2212-49eb-bce8-dc2fad08c93e" (UID: "100e999c-2212-49eb-bce8-dc2fad08c93e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:39:55.120874 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.120850 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100e999c-2212-49eb-bce8-dc2fad08c93e-kube-api-access-dsr28" (OuterVolumeSpecName: "kube-api-access-dsr28") pod "100e999c-2212-49eb-bce8-dc2fad08c93e" (UID: "100e999c-2212-49eb-bce8-dc2fad08c93e"). InnerVolumeSpecName "kube-api-access-dsr28". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:39:55.123656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.123632 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-util" (OuterVolumeSpecName: "util") pod "100e999c-2212-49eb-bce8-dc2fad08c93e" (UID: "100e999c-2212-49eb-bce8-dc2fad08c93e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:39:55.220222 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.220183 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dsr28\" (UniqueName: \"kubernetes.io/projected/100e999c-2212-49eb-bce8-dc2fad08c93e-kube-api-access-dsr28\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:39:55.220222 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.220220 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:39:55.220460 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.220236 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/100e999c-2212-49eb-bce8-dc2fad08c93e-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:39:55.936604 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.936575 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn"
Apr 23 13:39:55.936756 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.936574 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835nj7qn" event={"ID":"100e999c-2212-49eb-bce8-dc2fad08c93e","Type":"ContainerDied","Data":"57e03c6dee7cec81993109387dec0e8a01170d0c00b796951bbd4a37dfdb071a"}
Apr 23 13:39:55.936756 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:39:55.936685 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57e03c6dee7cec81993109387dec0e8a01170d0c00b796951bbd4a37dfdb071a"
Apr 23 13:40:04.632712 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.632679 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"]
Apr 23 13:40:04.633077 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.632920 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerName="pull"
Apr 23 13:40:04.633077 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.632936 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerName="pull"
Apr 23 13:40:04.633077 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.632947 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerName="extract"
Apr 23 13:40:04.633077 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.632953 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerName="extract"
Apr 23 13:40:04.633077 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.632962 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerName="util"
Apr 23 13:40:04.633077 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.632967 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerName="util"
Apr 23 13:40:04.633077 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.633016 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="100e999c-2212-49eb-bce8-dc2fad08c93e" containerName="extract"
Apr 23 13:40:04.638884 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.638859 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.646488 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.646465 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bxpk4\""
Apr 23 13:40:04.650350 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.647179 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 23 13:40:04.651405 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.651379 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 23 13:40:04.653549 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.653526 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"]
Apr 23 13:40:04.684075 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.684050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld2q6\" (UniqueName: \"kubernetes.io/projected/9910544d-3fb5-4106-b825-e4e83cce3da7-kube-api-access-ld2q6\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.684187 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.684087 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.684187 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.684114 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.784468 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.784423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld2q6\" (UniqueName: \"kubernetes.io/projected/9910544d-3fb5-4106-b825-e4e83cce3da7-kube-api-access-ld2q6\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.784468 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.784472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.784656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.784587 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.784813 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.784798 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.784899 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.784880 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.810439 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.810407 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld2q6\" (UniqueName: \"kubernetes.io/projected/9910544d-3fb5-4106-b825-e4e83cce3da7-kube-api-access-ld2q6\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:04.948117 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:04.948092 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:05.251186 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:05.251153 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9910544d_3fb5_4106_b825_e4e83cce3da7.slice/crio-e8cda5954c8ef6f041210970374312812c00ab145ac4294e635c0f500eefa9ae WatchSource:0}: Error finding container e8cda5954c8ef6f041210970374312812c00ab145ac4294e635c0f500eefa9ae: Status 404 returned error can't find the container with id e8cda5954c8ef6f041210970374312812c00ab145ac4294e635c0f500eefa9ae
Apr 23 13:40:05.286737 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:05.286712 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"]
Apr 23 13:40:05.968492 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:05.968461 2571 generic.go:358] "Generic (PLEG): container finished" podID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerID="6fd67a74a08e92590526485f6179b9fe16e36fbbef27ac63245cee0b62f9f55b" exitCode=0
Apr 23 13:40:05.968840 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:05.968507 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8" event={"ID":"9910544d-3fb5-4106-b825-e4e83cce3da7","Type":"ContainerDied","Data":"6fd67a74a08e92590526485f6179b9fe16e36fbbef27ac63245cee0b62f9f55b"}
Apr 23 13:40:05.968840 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:05.968533 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8" event={"ID":"9910544d-3fb5-4106-b825-e4e83cce3da7","Type":"ContainerStarted","Data":"e8cda5954c8ef6f041210970374312812c00ab145ac4294e635c0f500eefa9ae"}
Apr 23 13:40:06.973561 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:06.973529 2571 generic.go:358] "Generic (PLEG): container finished" podID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerID="070ad6ef88e4e919c9068e20c2ba7822679d80c03326ffe1f61d5432caa9444c" exitCode=0
Apr 23 13:40:06.973926 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:06.973570 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8" event={"ID":"9910544d-3fb5-4106-b825-e4e83cce3da7","Type":"ContainerDied","Data":"070ad6ef88e4e919c9068e20c2ba7822679d80c03326ffe1f61d5432caa9444c"}
Apr 23 13:40:07.978960 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:07.978922 2571 generic.go:358] "Generic (PLEG): container finished" podID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerID="17836a6ee5404db514d09bacfdfd44efb73447e23c57242826f03a9fe5dc1b02" exitCode=0
Apr 23 13:40:07.979327 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:07.978977 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8" event={"ID":"9910544d-3fb5-4106-b825-e4e83cce3da7","Type":"ContainerDied","Data":"17836a6ee5404db514d09bacfdfd44efb73447e23c57242826f03a9fe5dc1b02"}
Apr 23 13:40:08.049082 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.049047 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"]
Apr 23 13:40:08.051979 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.051958 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"
Apr 23 13:40:08.057159 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.057135 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-sc4jp\""
Apr 23 13:40:08.060663 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.060647 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 23 13:40:08.064927 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.064913 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 23 13:40:08.106439 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.106418 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t57x\" (UniqueName: \"kubernetes.io/projected/a67aea19-21cc-4ec6-a53b-87b67eda7b4c-kube-api-access-7t57x\") pod \"servicemesh-operator3-55f49c5f94-cx42n\" (UID: \"a67aea19-21cc-4ec6-a53b-87b67eda7b4c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"
Apr 23 13:40:08.106541 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.106449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a67aea19-21cc-4ec6-a53b-87b67eda7b4c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cx42n\" (UID: \"a67aea19-21cc-4ec6-a53b-87b67eda7b4c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"
Apr 23 13:40:08.160136 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.160113 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"]
Apr 23 13:40:08.211357 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.207832 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t57x\" (UniqueName: \"kubernetes.io/projected/a67aea19-21cc-4ec6-a53b-87b67eda7b4c-kube-api-access-7t57x\") pod \"servicemesh-operator3-55f49c5f94-cx42n\" (UID: \"a67aea19-21cc-4ec6-a53b-87b67eda7b4c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"
Apr 23 13:40:08.211357 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.207893 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a67aea19-21cc-4ec6-a53b-87b67eda7b4c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cx42n\" (UID: \"a67aea19-21cc-4ec6-a53b-87b67eda7b4c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"
Apr 23 13:40:08.211357 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.211123 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a67aea19-21cc-4ec6-a53b-87b67eda7b4c-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cx42n\" (UID: \"a67aea19-21cc-4ec6-a53b-87b67eda7b4c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"
Apr 23 13:40:08.228104 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.228079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t57x\" (UniqueName: \"kubernetes.io/projected/a67aea19-21cc-4ec6-a53b-87b67eda7b4c-kube-api-access-7t57x\") pod \"servicemesh-operator3-55f49c5f94-cx42n\" (UID: \"a67aea19-21cc-4ec6-a53b-87b67eda7b4c\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"
Apr 23 13:40:08.361064 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.360998 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"
Apr 23 13:40:08.522999 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.522974 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cx42n"]
Apr 23 13:40:08.525709 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:08.525689 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67aea19_21cc_4ec6_a53b_87b67eda7b4c.slice/crio-18a3f3e99a1a7baa55351d4fa8d5a004c04973f680cb130ca2d241e906177aa1 WatchSource:0}: Error finding container 18a3f3e99a1a7baa55351d4fa8d5a004c04973f680cb130ca2d241e906177aa1: Status 404 returned error can't find the container with id 18a3f3e99a1a7baa55351d4fa8d5a004c04973f680cb130ca2d241e906177aa1
Apr 23 13:40:08.983234 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:08.983192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n" event={"ID":"a67aea19-21cc-4ec6-a53b-87b67eda7b4c","Type":"ContainerStarted","Data":"18a3f3e99a1a7baa55351d4fa8d5a004c04973f680cb130ca2d241e906177aa1"}
Apr 23 13:40:09.094471 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.094446 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8"
Apr 23 13:40:09.213467 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.213433 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld2q6\" (UniqueName: \"kubernetes.io/projected/9910544d-3fb5-4106-b825-e4e83cce3da7-kube-api-access-ld2q6\") pod \"9910544d-3fb5-4106-b825-e4e83cce3da7\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") "
Apr 23 13:40:09.213467 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.213466 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-util\") pod \"9910544d-3fb5-4106-b825-e4e83cce3da7\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") "
Apr 23 13:40:09.213653 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.213504 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-bundle\") pod \"9910544d-3fb5-4106-b825-e4e83cce3da7\" (UID: \"9910544d-3fb5-4106-b825-e4e83cce3da7\") "
Apr 23 13:40:09.214381 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.214352 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-bundle" (OuterVolumeSpecName: "bundle") pod "9910544d-3fb5-4106-b825-e4e83cce3da7" (UID: "9910544d-3fb5-4106-b825-e4e83cce3da7"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:09.215562 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.215536 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9910544d-3fb5-4106-b825-e4e83cce3da7-kube-api-access-ld2q6" (OuterVolumeSpecName: "kube-api-access-ld2q6") pod "9910544d-3fb5-4106-b825-e4e83cce3da7" (UID: "9910544d-3fb5-4106-b825-e4e83cce3da7"). InnerVolumeSpecName "kube-api-access-ld2q6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:40:09.219489 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.219467 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-util" (OuterVolumeSpecName: "util") pod "9910544d-3fb5-4106-b825-e4e83cce3da7" (UID: "9910544d-3fb5-4106-b825-e4e83cce3da7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:09.314325 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.314258 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:09.314325 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.314283 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ld2q6\" (UniqueName: \"kubernetes.io/projected/9910544d-3fb5-4106-b825-e4e83cce3da7-kube-api-access-ld2q6\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:09.314325 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.314294 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9910544d-3fb5-4106-b825-e4e83cce3da7-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:09.988903 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.988871 2571 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8" Apr 23 13:40:09.989400 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.988870 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebzmxw8" event={"ID":"9910544d-3fb5-4106-b825-e4e83cce3da7","Type":"ContainerDied","Data":"e8cda5954c8ef6f041210970374312812c00ab145ac4294e635c0f500eefa9ae"} Apr 23 13:40:09.989400 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:09.988995 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8cda5954c8ef6f041210970374312812c00ab145ac4294e635c0f500eefa9ae" Apr 23 13:40:11.999416 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:11.999366 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n" event={"ID":"a67aea19-21cc-4ec6-a53b-87b67eda7b4c","Type":"ContainerStarted","Data":"581720b085fddb752aab417e0623980746daf2c2e1283cd7c518fdd7d7fe9c74"} Apr 23 13:40:11.999831 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:11.999459 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n" Apr 23 13:40:12.039213 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.039160 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n" podStartSLOduration=1.468053317 podStartE2EDuration="4.039142369s" podCreationTimestamp="2026-04-23 13:40:08 +0000 UTC" firstStartedPulling="2026-04-23 13:40:08.52847514 +0000 UTC m=+542.666753803" lastFinishedPulling="2026-04-23 13:40:11.099564177 +0000 UTC m=+545.237842855" observedRunningTime="2026-04-23 13:40:12.038388167 +0000 UTC m=+546.176666853" watchObservedRunningTime="2026-04-23 13:40:12.039142369 +0000 UTC m=+546.177421056" Apr 23 
13:40:12.126051 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.126023 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c"] Apr 23 13:40:12.126263 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.126252 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerName="util" Apr 23 13:40:12.126316 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.126264 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerName="util" Apr 23 13:40:12.126316 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.126275 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerName="extract" Apr 23 13:40:12.126316 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.126280 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerName="extract" Apr 23 13:40:12.126316 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.126298 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerName="pull" Apr 23 13:40:12.126316 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.126303 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerName="pull" Apr 23 13:40:12.126563 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.126354 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9910544d-3fb5-4106-b825-e4e83cce3da7" containerName="extract" Apr 23 13:40:12.129134 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.129116 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.132501 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.132476 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 23 13:40:12.132619 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.132501 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 13:40:12.132783 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.132762 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 23 13:40:12.132884 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.132811 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 23 13:40:12.132884 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.132862 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 13:40:12.132984 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.132903 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-2xnwp\"" Apr 23 13:40:12.132984 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.132768 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 23 13:40:12.144504 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.144484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c"] Apr 23 13:40:12.236603 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.236569 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.236603 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.236604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/254690a2-4108-40d3-9aca-efd709937afd-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.236798 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.236638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.236798 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.236682 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.236798 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.236698 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: 
\"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.236798 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.236740 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjnmf\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-kube-api-access-pjnmf\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.236798 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.236763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.337284 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.337199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.337284 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.337236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.337284 ip-10-0-128-108 kubenswrapper[2571]: I0423 
13:40:12.337269 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjnmf\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-kube-api-access-pjnmf\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.337554 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.337297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.337554 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.337322 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.337705 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.337678 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/254690a2-4108-40d3-9aca-efd709937afd-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.337789 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.337739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-cacerts\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.338442 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.338418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.339935 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.339902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.340141 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.340123 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/254690a2-4108-40d3-9aca-efd709937afd-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.340233 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.340219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.340349 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.340307 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.345653 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.345623 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.345653 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.345630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjnmf\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-kube-api-access-pjnmf\") pod \"istiod-openshift-gateway-7cd77c7ffd-kfm5c\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.438221 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.438184 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:12.571739 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:12.571708 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c"] Apr 23 13:40:12.572624 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:12.572598 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod254690a2_4108_40d3_9aca_efd709937afd.slice/crio-ba93bdc8952b50440dbc70bc9e90c164ce587d0e2cc2013fa3f09e0b53448cbc WatchSource:0}: Error finding container ba93bdc8952b50440dbc70bc9e90c164ce587d0e2cc2013fa3f09e0b53448cbc: Status 404 returned error can't find the container with id ba93bdc8952b50440dbc70bc9e90c164ce587d0e2cc2013fa3f09e0b53448cbc Apr 23 13:40:13.003857 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:13.003817 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" event={"ID":"254690a2-4108-40d3-9aca-efd709937afd","Type":"ContainerStarted","Data":"ba93bdc8952b50440dbc70bc9e90c164ce587d0e2cc2013fa3f09e0b53448cbc"} Apr 23 13:40:15.122813 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:15.122741 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 13:40:15.123156 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:15.122844 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 13:40:16.017008 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:16.016970 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" 
event={"ID":"254690a2-4108-40d3-9aca-efd709937afd","Type":"ContainerStarted","Data":"b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526"} Apr 23 13:40:16.017322 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:16.017297 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:16.019042 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:16.019013 2571 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-kfm5c container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 23 13:40:16.019137 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:16.019081 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" podUID="254690a2-4108-40d3-9aca-efd709937afd" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:16.053009 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:16.052945 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" podStartSLOduration=1.504885877 podStartE2EDuration="4.05292732s" podCreationTimestamp="2026-04-23 13:40:12 +0000 UTC" firstStartedPulling="2026-04-23 13:40:12.574490377 +0000 UTC m=+546.712769041" lastFinishedPulling="2026-04-23 13:40:15.122531818 +0000 UTC m=+549.260810484" observedRunningTime="2026-04-23 13:40:16.05032301 +0000 UTC m=+550.188601695" watchObservedRunningTime="2026-04-23 13:40:16.05292732 +0000 UTC m=+550.191206005" Apr 23 13:40:17.021207 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:17.021169 2571 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-kfm5c container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= 
Apr 23 13:40:17.021675 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:17.021231 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" podUID="254690a2-4108-40d3-9aca-efd709937afd" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:40:20.021803 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.021773 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:40:20.387222 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.387139 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v"] Apr 23 13:40:20.390314 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.390299 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.392897 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.392875 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-n4s6s\"" Apr 23 13:40:20.402185 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.402164 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v"] Apr 23 13:40:20.497485 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497450 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/340b1383-bc96-4f31-a84c-6b1b58369cbe-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 
13:40:20.497632 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497494 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.497632 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497530 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.497632 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497553 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.497632 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.497632 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497601 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.497632 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497634 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.497883 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6gxx\" (UniqueName: \"kubernetes.io/projected/340b1383-bc96-4f31-a84c-6b1b58369cbe-kube-api-access-w6gxx\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.497883 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.497730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598168 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:40:20.598138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598168 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598169 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598219 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6gxx\" (UniqueName: \"kubernetes.io/projected/340b1383-bc96-4f31-a84c-6b1b58369cbe-kube-api-access-w6gxx\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598248 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598301 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/340b1383-bc96-4f31-a84c-6b1b58369cbe-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598815 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598604 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.598815 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.598803 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.599149 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.599126 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.599264 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.599175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-workload-socket\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.599613 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.599590 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/340b1383-bc96-4f31-a84c-6b1b58369cbe-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.601053 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.601025 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.601234 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.601212 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.606236 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.606205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6gxx\" (UniqueName: \"kubernetes.io/projected/340b1383-bc96-4f31-a84c-6b1b58369cbe-kube-api-access-w6gxx\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.606520 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.606502 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/340b1383-bc96-4f31-a84c-6b1b58369cbe-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-nsr6v\" (UID: \"340b1383-bc96-4f31-a84c-6b1b58369cbe\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.701754 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.701730 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:20.831344 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:20.831300 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v"] Apr 23 13:40:20.832971 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:20.832937 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod340b1383_bc96_4f31_a84c_6b1b58369cbe.slice/crio-454ee0baa67030af1262dc365c8719a6bc7be9f5bd118b9fde85b63277030e5b WatchSource:0}: Error finding container 454ee0baa67030af1262dc365c8719a6bc7be9f5bd118b9fde85b63277030e5b: Status 404 returned error can't find the container with id 454ee0baa67030af1262dc365c8719a6bc7be9f5bd118b9fde85b63277030e5b Apr 23 13:40:21.034367 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:21.034266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" event={"ID":"340b1383-bc96-4f31-a84c-6b1b58369cbe","Type":"ContainerStarted","Data":"454ee0baa67030af1262dc365c8719a6bc7be9f5bd118b9fde85b63277030e5b"} Apr 23 13:40:23.006888 ip-10-0-128-108 kubenswrapper[2571]: I0423 
13:40:23.006859 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cx42n" Apr 23 13:40:23.483076 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:23.483037 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 13:40:23.483182 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:23.483110 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 13:40:23.483182 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:23.483137 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 23 13:40:24.046323 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:24.046285 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" event={"ID":"340b1383-bc96-4f31-a84c-6b1b58369cbe","Type":"ContainerStarted","Data":"12f941d7bc05f0e87e5efff537bbce97bf94e8cfa40a14af62e11ad901889873"} Apr 23 13:40:24.101264 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:24.101202 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" podStartSLOduration=1.453177348 podStartE2EDuration="4.10118337s" podCreationTimestamp="2026-04-23 13:40:20 +0000 UTC" firstStartedPulling="2026-04-23 13:40:20.834812245 +0000 UTC m=+554.973090907" lastFinishedPulling="2026-04-23 13:40:23.482818257 +0000 UTC m=+557.621096929" observedRunningTime="2026-04-23 13:40:24.098117489 +0000 UTC m=+558.236396173" watchObservedRunningTime="2026-04-23 13:40:24.10118337 +0000 UTC m=+558.239462059" Apr 
23 13:40:24.702840 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:24.702804 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:24.707307 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:24.707279 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:25.049541 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:25.049461 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:25.050455 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:25.050435 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-nsr6v" Apr 23 13:40:34.286757 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.286722 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846"] Apr 23 13:40:34.291319 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.291302 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.293817 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.293797 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 13:40:34.293951 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.293833 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 13:40:34.293951 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.293891 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bxpk4\"" Apr 23 13:40:34.298682 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.298657 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846"] Apr 23 13:40:34.385184 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.385152 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd"] Apr 23 13:40:34.388600 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.388584 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.396484 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.396459 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd"] Apr 23 13:40:34.405507 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.405484 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.405745 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.405717 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7jm\" (UniqueName: \"kubernetes.io/projected/2b002a2d-0cc9-471b-be88-a45268e1b1bf-kube-api-access-bk7jm\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.405902 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.405824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.482412 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.482378 2571 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk"] Apr 23 13:40:34.485873 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.485858 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.493385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.493361 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk"] Apr 23 13:40:34.506583 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.506561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7jm\" (UniqueName: \"kubernetes.io/projected/2b002a2d-0cc9-471b-be88-a45268e1b1bf-kube-api-access-bk7jm\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.506677 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.506599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.506677 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.506629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.506677 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.506649 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.506940 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.506683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsqlk\" (UniqueName: \"kubernetes.io/projected/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-kube-api-access-bsqlk\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.506940 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.506711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.507012 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.506976 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.507012 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.507003 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.514116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.514098 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7jm\" (UniqueName: \"kubernetes.io/projected/2b002a2d-0cc9-471b-be88-a45268e1b1bf-kube-api-access-bk7jm\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.583210 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.583142 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n"] Apr 23 13:40:34.586599 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.586581 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.597484 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.597460 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n"] Apr 23 13:40:34.600979 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.600958 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:34.609829 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.609057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnlpg\" (UniqueName: \"kubernetes.io/projected/86da3a6f-c91c-4b25-be13-d7e76f9f426f-kube-api-access-hnlpg\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.609829 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.609119 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.609829 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.609152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.609829 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.609207 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsqlk\" (UniqueName: \"kubernetes.io/projected/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-kube-api-access-bsqlk\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: 
\"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.609829 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.609239 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.609829 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.609270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.609829 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.609704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.610230 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.610052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " 
pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.617908 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.617883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsqlk\" (UniqueName: \"kubernetes.io/projected/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-kube-api-access-bsqlk\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.697696 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.697657 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:34.710154 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.710101 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.710286 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.710232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.710384 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.710300 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnlpg\" (UniqueName: 
\"kubernetes.io/projected/86da3a6f-c91c-4b25-be13-d7e76f9f426f-kube-api-access-hnlpg\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.710442 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.710405 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.710501 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.710454 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whdk\" (UniqueName: \"kubernetes.io/projected/cef7c57d-d088-47ef-b81f-acbc7af31b5a-kube-api-access-9whdk\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.710501 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.710490 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.710740 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.710685 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.710888 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.710832 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.718654 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.718628 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnlpg\" (UniqueName: \"kubernetes.io/projected/86da3a6f-c91c-4b25-be13-d7e76f9f426f-kube-api-access-hnlpg\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.725701 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.725680 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846"] Apr 23 13:40:34.727530 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:34.727507 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b002a2d_0cc9_471b_be88_a45268e1b1bf.slice/crio-fc63ccc94cf4855e15ea60acfcb5ff3999a971c091fc2928523ed9362d46e70b WatchSource:0}: Error finding container fc63ccc94cf4855e15ea60acfcb5ff3999a971c091fc2928523ed9362d46e70b: Status 404 returned error can't find the container with id 
fc63ccc94cf4855e15ea60acfcb5ff3999a971c091fc2928523ed9362d46e70b Apr 23 13:40:34.795671 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.795594 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:34.811662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.811629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9whdk\" (UniqueName: \"kubernetes.io/projected/cef7c57d-d088-47ef-b81f-acbc7af31b5a-kube-api-access-9whdk\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.811790 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.811687 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.811790 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.811742 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.812305 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.812256 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.812305 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.812296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.819832 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.819802 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whdk\" (UniqueName: \"kubernetes.io/projected/cef7c57d-d088-47ef-b81f-acbc7af31b5a-kube-api-access-9whdk\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.833406 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.831849 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd"] Apr 23 13:40:34.896385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.896357 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:34.922955 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:34.922928 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk"] Apr 23 13:40:34.926117 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:34.926089 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86da3a6f_c91c_4b25_be13_d7e76f9f426f.slice/crio-9e87925f7364b9442023b26442725a8d564f9b95b3147017d6e5381c4f3e8264 WatchSource:0}: Error finding container 9e87925f7364b9442023b26442725a8d564f9b95b3147017d6e5381c4f3e8264: Status 404 returned error can't find the container with id 9e87925f7364b9442023b26442725a8d564f9b95b3147017d6e5381c4f3e8264 Apr 23 13:40:35.049682 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.049660 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n"] Apr 23 13:40:35.079029 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:35.079001 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcef7c57d_d088_47ef_b81f_acbc7af31b5a.slice/crio-0569a8ba02e218397a83d1fc2b7dc818e13beb02fcf1eb3f1931a25a30947f59 WatchSource:0}: Error finding container 0569a8ba02e218397a83d1fc2b7dc818e13beb02fcf1eb3f1931a25a30947f59: Status 404 returned error can't find the container with id 0569a8ba02e218397a83d1fc2b7dc818e13beb02fcf1eb3f1931a25a30947f59 Apr 23 13:40:35.082976 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.082953 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerID="acb0b94932886debea88caa775a5bc943650e80375e6d1b5b9645d2dae3853a9" exitCode=0 Apr 23 13:40:35.083067 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.083036 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" event={"ID":"2b002a2d-0cc9-471b-be88-a45268e1b1bf","Type":"ContainerDied","Data":"acb0b94932886debea88caa775a5bc943650e80375e6d1b5b9645d2dae3853a9"} Apr 23 13:40:35.083115 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.083074 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" event={"ID":"2b002a2d-0cc9-471b-be88-a45268e1b1bf","Type":"ContainerStarted","Data":"fc63ccc94cf4855e15ea60acfcb5ff3999a971c091fc2928523ed9362d46e70b"} Apr 23 13:40:35.084308 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.084285 2571 generic.go:358] "Generic (PLEG): container finished" podID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerID="7165f94d340f2f537dc6d0b64bf0371c10d42bfca109ee66d80eac6cc11c796b" exitCode=0 Apr 23 13:40:35.084389 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.084361 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" event={"ID":"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50","Type":"ContainerDied","Data":"7165f94d340f2f537dc6d0b64bf0371c10d42bfca109ee66d80eac6cc11c796b"} Apr 23 13:40:35.084446 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.084392 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" event={"ID":"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50","Type":"ContainerStarted","Data":"8d7a7107942cc9ad11436ee979036029b169b4eaa3faa75e77a29088e9f1cd60"} Apr 23 13:40:35.085958 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.085914 2571 generic.go:358] "Generic (PLEG): container finished" podID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" 
containerID="c8c12ccb568ef62c823d6eab0b0048564c829b60f38f2a80baec43bea1ec2028" exitCode=0 Apr 23 13:40:35.086028 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.085985 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" event={"ID":"86da3a6f-c91c-4b25-be13-d7e76f9f426f","Type":"ContainerDied","Data":"c8c12ccb568ef62c823d6eab0b0048564c829b60f38f2a80baec43bea1ec2028"} Apr 23 13:40:35.086091 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:35.086051 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" event={"ID":"86da3a6f-c91c-4b25-be13-d7e76f9f426f","Type":"ContainerStarted","Data":"9e87925f7364b9442023b26442725a8d564f9b95b3147017d6e5381c4f3e8264"} Apr 23 13:40:36.091147 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:36.091111 2571 generic.go:358] "Generic (PLEG): container finished" podID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerID="dd4888325f041691d98b029b375070592c5bc7ce92dccc0c32cf4917616adabd" exitCode=0 Apr 23 13:40:36.091551 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:36.091196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" event={"ID":"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50","Type":"ContainerDied","Data":"dd4888325f041691d98b029b375070592c5bc7ce92dccc0c32cf4917616adabd"} Apr 23 13:40:36.092583 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:36.092562 2571 generic.go:358] "Generic (PLEG): container finished" podID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerID="e871d2591807cb8ef07b688c366944f8a7585ac1bbcd17970bbf06227f4be594" exitCode=0 Apr 23 13:40:36.092675 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:36.092603 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" event={"ID":"cef7c57d-d088-47ef-b81f-acbc7af31b5a","Type":"ContainerDied","Data":"e871d2591807cb8ef07b688c366944f8a7585ac1bbcd17970bbf06227f4be594"} Apr 23 13:40:36.092675 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:36.092636 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" event={"ID":"cef7c57d-d088-47ef-b81f-acbc7af31b5a","Type":"ContainerStarted","Data":"0569a8ba02e218397a83d1fc2b7dc818e13beb02fcf1eb3f1931a25a30947f59"} Apr 23 13:40:37.098128 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:37.098096 2571 generic.go:358] "Generic (PLEG): container finished" podID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerID="68ea322c7f9b8ba7091cf9a8dc8855fababdee6a386cb84b45580128ee90d851" exitCode=0 Apr 23 13:40:37.098522 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:37.098178 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" event={"ID":"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50","Type":"ContainerDied","Data":"68ea322c7f9b8ba7091cf9a8dc8855fababdee6a386cb84b45580128ee90d851"} Apr 23 13:40:37.099668 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:37.099645 2571 generic.go:358] "Generic (PLEG): container finished" podID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerID="be68abe81ca4422a7fb30be951138b7066cfba719dd37c6edf7cfa4391a0f48d" exitCode=0 Apr 23 13:40:37.099767 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:37.099732 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" event={"ID":"86da3a6f-c91c-4b25-be13-d7e76f9f426f","Type":"ContainerDied","Data":"be68abe81ca4422a7fb30be951138b7066cfba719dd37c6edf7cfa4391a0f48d"} Apr 23 13:40:37.101232 ip-10-0-128-108 kubenswrapper[2571]: I0423 
13:40:37.101212 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerID="3f0df2887ef2ec7685590c55ad0a205a8038e7001e42ea6b54c1f629c802863d" exitCode=0 Apr 23 13:40:37.101298 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:37.101251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" event={"ID":"2b002a2d-0cc9-471b-be88-a45268e1b1bf","Type":"ContainerDied","Data":"3f0df2887ef2ec7685590c55ad0a205a8038e7001e42ea6b54c1f629c802863d"} Apr 23 13:40:38.106536 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.106501 2571 generic.go:358] "Generic (PLEG): container finished" podID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerID="7599e84cf7bec0e24fc4613c22fc4364b7802a25361e94c0a55ba94acbc83b4f" exitCode=0 Apr 23 13:40:38.106957 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.106592 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" event={"ID":"cef7c57d-d088-47ef-b81f-acbc7af31b5a","Type":"ContainerDied","Data":"7599e84cf7bec0e24fc4613c22fc4364b7802a25361e94c0a55ba94acbc83b4f"} Apr 23 13:40:38.108690 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.108666 2571 generic.go:358] "Generic (PLEG): container finished" podID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerID="3dc12deb44d3d378ecf36f72c318109bde51811e0f259c363f32d49095f2b1a1" exitCode=0 Apr 23 13:40:38.108794 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.108744 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" event={"ID":"2b002a2d-0cc9-471b-be88-a45268e1b1bf","Type":"ContainerDied","Data":"3dc12deb44d3d378ecf36f72c318109bde51811e0f259c363f32d49095f2b1a1"} Apr 23 13:40:38.110465 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.110443 2571 generic.go:358] 
"Generic (PLEG): container finished" podID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerID="ac78b7880bd992330f2bc3be4c5ee79abf805ee5a14712d286e25b1255882fea" exitCode=0 Apr 23 13:40:38.110553 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.110524 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" event={"ID":"86da3a6f-c91c-4b25-be13-d7e76f9f426f","Type":"ContainerDied","Data":"ac78b7880bd992330f2bc3be4c5ee79abf805ee5a14712d286e25b1255882fea"} Apr 23 13:40:38.235800 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.235776 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:38.340870 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.340805 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsqlk\" (UniqueName: \"kubernetes.io/projected/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-kube-api-access-bsqlk\") pod \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " Apr 23 13:40:38.341021 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.340943 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-bundle\") pod \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " Apr 23 13:40:38.341021 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.340973 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-util\") pod \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\" (UID: \"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50\") " Apr 23 13:40:38.341449 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.341421 2571 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-bundle" (OuterVolumeSpecName: "bundle") pod "bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" (UID: "bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:38.342782 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.342756 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-kube-api-access-bsqlk" (OuterVolumeSpecName: "kube-api-access-bsqlk") pod "bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" (UID: "bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50"). InnerVolumeSpecName "kube-api-access-bsqlk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:40:38.346002 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.345983 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-util" (OuterVolumeSpecName: "util") pod "bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" (UID: "bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:38.441666 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.441645 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:38.441666 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.441668 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:38.441769 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:38.441677 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsqlk\" (UniqueName: \"kubernetes.io/projected/bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50-kube-api-access-bsqlk\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:39.115974 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.115942 2571 generic.go:358] "Generic (PLEG): container finished" podID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerID="9ad6bebf7c026df9413611521e1e9a09c65a42e4016df019289e91135a068990" exitCode=0 Apr 23 13:40:39.116365 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.116021 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" event={"ID":"cef7c57d-d088-47ef-b81f-acbc7af31b5a","Type":"ContainerDied","Data":"9ad6bebf7c026df9413611521e1e9a09c65a42e4016df019289e91135a068990"} Apr 23 13:40:39.117659 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.117635 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" 
event={"ID":"bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50","Type":"ContainerDied","Data":"8d7a7107942cc9ad11436ee979036029b169b4eaa3faa75e77a29088e9f1cd60"} Apr 23 13:40:39.117659 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.117659 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec885wxmd" Apr 23 13:40:39.117790 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.117664 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d7a7107942cc9ad11436ee979036029b169b4eaa3faa75e77a29088e9f1cd60" Apr 23 13:40:39.265133 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.265111 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:39.268023 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.268005 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:39.348393 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.348364 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-bundle\") pod \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " Apr 23 13:40:39.348577 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.348413 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-bundle\") pod \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " Apr 23 13:40:39.348577 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.348455 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnlpg\" (UniqueName: \"kubernetes.io/projected/86da3a6f-c91c-4b25-be13-d7e76f9f426f-kube-api-access-hnlpg\") pod \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " Apr 23 13:40:39.348577 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.348489 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk7jm\" (UniqueName: \"kubernetes.io/projected/2b002a2d-0cc9-471b-be88-a45268e1b1bf-kube-api-access-bk7jm\") pod \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " Apr 23 13:40:39.348577 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.348507 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-util\") pod \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\" (UID: \"86da3a6f-c91c-4b25-be13-d7e76f9f426f\") " Apr 23 13:40:39.348577 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:40:39.348528 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-util\") pod \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\" (UID: \"2b002a2d-0cc9-471b-be88-a45268e1b1bf\") " Apr 23 13:40:39.348974 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.348950 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-bundle" (OuterVolumeSpecName: "bundle") pod "86da3a6f-c91c-4b25-be13-d7e76f9f426f" (UID: "86da3a6f-c91c-4b25-be13-d7e76f9f426f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:39.349117 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.349095 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-bundle" (OuterVolumeSpecName: "bundle") pod "2b002a2d-0cc9-471b-be88-a45268e1b1bf" (UID: "2b002a2d-0cc9-471b-be88-a45268e1b1bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:39.350605 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.350577 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86da3a6f-c91c-4b25-be13-d7e76f9f426f-kube-api-access-hnlpg" (OuterVolumeSpecName: "kube-api-access-hnlpg") pod "86da3a6f-c91c-4b25-be13-d7e76f9f426f" (UID: "86da3a6f-c91c-4b25-be13-d7e76f9f426f"). InnerVolumeSpecName "kube-api-access-hnlpg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:40:39.350978 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.350956 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b002a2d-0cc9-471b-be88-a45268e1b1bf-kube-api-access-bk7jm" (OuterVolumeSpecName: "kube-api-access-bk7jm") pod "2b002a2d-0cc9-471b-be88-a45268e1b1bf" (UID: "2b002a2d-0cc9-471b-be88-a45268e1b1bf"). InnerVolumeSpecName "kube-api-access-bk7jm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:40:39.354314 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.354284 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-util" (OuterVolumeSpecName: "util") pod "2b002a2d-0cc9-471b-be88-a45268e1b1bf" (UID: "2b002a2d-0cc9-471b-be88-a45268e1b1bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:39.354522 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.354502 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-util" (OuterVolumeSpecName: "util") pod "86da3a6f-c91c-4b25-be13-d7e76f9f426f" (UID: "86da3a6f-c91c-4b25-be13-d7e76f9f426f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:39.449644 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.449620 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnlpg\" (UniqueName: \"kubernetes.io/projected/86da3a6f-c91c-4b25-be13-d7e76f9f426f-kube-api-access-hnlpg\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:39.449644 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.449644 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bk7jm\" (UniqueName: \"kubernetes.io/projected/2b002a2d-0cc9-471b-be88-a45268e1b1bf-kube-api-access-bk7jm\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:39.449787 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.449655 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:39.449787 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.449664 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:39.449787 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.449672 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b002a2d-0cc9-471b-be88-a45268e1b1bf-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:39.449787 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:39.449681 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86da3a6f-c91c-4b25-be13-d7e76f9f426f-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:40.123662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.123619 2571 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" Apr 23 13:40:40.123662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.123638 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bpj6rk" event={"ID":"86da3a6f-c91c-4b25-be13-d7e76f9f426f","Type":"ContainerDied","Data":"9e87925f7364b9442023b26442725a8d564f9b95b3147017d6e5381c4f3e8264"} Apr 23 13:40:40.123662 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.123670 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e87925f7364b9442023b26442725a8d564f9b95b3147017d6e5381c4f3e8264" Apr 23 13:40:40.125385 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.125360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" event={"ID":"2b002a2d-0cc9-471b-be88-a45268e1b1bf","Type":"ContainerDied","Data":"fc63ccc94cf4855e15ea60acfcb5ff3999a971c091fc2928523ed9362d46e70b"} Apr 23 13:40:40.125474 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.125390 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc63ccc94cf4855e15ea60acfcb5ff3999a971c091fc2928523ed9362d46e70b" Apr 23 13:40:40.125474 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.125408 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e503fc846" Apr 23 13:40:40.254956 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.254933 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:40.356511 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.356479 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-bundle\") pod \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " Apr 23 13:40:40.356668 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.356527 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-util\") pod \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " Apr 23 13:40:40.356668 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.356566 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9whdk\" (UniqueName: \"kubernetes.io/projected/cef7c57d-d088-47ef-b81f-acbc7af31b5a-kube-api-access-9whdk\") pod \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\" (UID: \"cef7c57d-d088-47ef-b81f-acbc7af31b5a\") " Apr 23 13:40:40.357066 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.357040 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-bundle" (OuterVolumeSpecName: "bundle") pod "cef7c57d-d088-47ef-b81f-acbc7af31b5a" (UID: "cef7c57d-d088-47ef-b81f-acbc7af31b5a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:40.358652 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.358628 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cef7c57d-d088-47ef-b81f-acbc7af31b5a-kube-api-access-9whdk" (OuterVolumeSpecName: "kube-api-access-9whdk") pod "cef7c57d-d088-47ef-b81f-acbc7af31b5a" (UID: "cef7c57d-d088-47ef-b81f-acbc7af31b5a"). InnerVolumeSpecName "kube-api-access-9whdk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:40:40.361517 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.361496 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-util" (OuterVolumeSpecName: "util") pod "cef7c57d-d088-47ef-b81f-acbc7af31b5a" (UID: "cef7c57d-d088-47ef-b81f-acbc7af31b5a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:40:40.457550 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.457529 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-util\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:40.457634 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.457552 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9whdk\" (UniqueName: \"kubernetes.io/projected/cef7c57d-d088-47ef-b81f-acbc7af31b5a-kube-api-access-9whdk\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:40.457634 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:40.457562 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cef7c57d-d088-47ef-b81f-acbc7af31b5a-bundle\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:40:41.130205 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:41.130161 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" event={"ID":"cef7c57d-d088-47ef-b81f-acbc7af31b5a","Type":"ContainerDied","Data":"0569a8ba02e218397a83d1fc2b7dc818e13beb02fcf1eb3f1931a25a30947f59"} Apr 23 13:40:41.130205 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:41.130200 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0569a8ba02e218397a83d1fc2b7dc818e13beb02fcf1eb3f1931a25a30947f59" Apr 23 13:40:41.130656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:41.130237 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30gf79n" Apr 23 13:40:46.626569 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626535 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"] Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626809 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerName="pull" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626820 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerName="pull" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626827 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerName="util" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626833 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerName="util" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626841 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerName="util" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626846 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerName="util" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626852 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerName="extract" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626857 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerName="extract" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626864 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerName="pull" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626869 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerName="pull" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626876 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerName="extract" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626881 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerName="extract" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626887 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerName="extract" Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626891 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerName="extract" Apr 23 
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626901 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerName="extract"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626905 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerName="extract"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626911 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerName="util"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626916 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerName="util"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626925 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerName="util"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626929 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerName="util"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626935 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerName="pull"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626939 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerName="pull"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626945 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerName="pull"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626949 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerName="pull"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626990 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="86da3a6f-c91c-4b25-be13-d7e76f9f426f" containerName="extract"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.626999 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc4c43e2-3fa0-4efe-b3b4-1bf4ed623d50" containerName="extract"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.627005 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="cef7c57d-d088-47ef-b81f-acbc7af31b5a" containerName="extract"
Apr 23 13:40:46.627020 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.627011 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b002a2d-0cc9-471b-be88-a45268e1b1bf" containerName="extract"
Apr 23 13:40:46.643252 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.643223 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"]
Apr 23 13:40:46.643407 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.643370 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:46.645925 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.645901 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 23 13:40:46.646103 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.646079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-zktfm\""
Apr 23 13:40:46.646257 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.646236 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 23 13:40:46.801605 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.801571 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a520890-080f-4739-bbea-32aea118ade2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j6x6p\" (UID: \"9a520890-080f-4739-bbea-32aea118ade2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:46.801753 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.801619 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwpnx\" (UniqueName: \"kubernetes.io/projected/9a520890-080f-4739-bbea-32aea118ade2-kube-api-access-hwpnx\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j6x6p\" (UID: \"9a520890-080f-4739-bbea-32aea118ade2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:46.903071 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.902985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a520890-080f-4739-bbea-32aea118ade2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j6x6p\" (UID: \"9a520890-080f-4739-bbea-32aea118ade2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:46.903071 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.903047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwpnx\" (UniqueName: \"kubernetes.io/projected/9a520890-080f-4739-bbea-32aea118ade2-kube-api-access-hwpnx\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j6x6p\" (UID: \"9a520890-080f-4739-bbea-32aea118ade2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:46.903423 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.903398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a520890-080f-4739-bbea-32aea118ade2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j6x6p\" (UID: \"9a520890-080f-4739-bbea-32aea118ade2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:46.915629 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.915593 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwpnx\" (UniqueName: \"kubernetes.io/projected/9a520890-080f-4739-bbea-32aea118ade2-kube-api-access-hwpnx\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-j6x6p\" (UID: \"9a520890-080f-4739-bbea-32aea118ade2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:46.954247 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:46.954229 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:47.078311 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:47.078283 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"]
Apr 23 13:40:47.078809 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:47.078786 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a520890_080f_4739_bbea_32aea118ade2.slice/crio-d63b5135da13926adc53ed4bb29109db9dc3ac5945cabfef0a065af975d4cd36 WatchSource:0}: Error finding container d63b5135da13926adc53ed4bb29109db9dc3ac5945cabfef0a065af975d4cd36: Status 404 returned error can't find the container with id d63b5135da13926adc53ed4bb29109db9dc3ac5945cabfef0a065af975d4cd36
Apr 23 13:40:47.153460 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:47.153388 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p" event={"ID":"9a520890-080f-4739-bbea-32aea118ade2","Type":"ContainerStarted","Data":"d63b5135da13926adc53ed4bb29109db9dc3ac5945cabfef0a065af975d4cd36"}
Apr 23 13:40:49.323086 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.323045 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-f4rbk"]
Apr 23 13:40:49.335092 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.335058 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk"
Apr 23 13:40:49.339185 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.339164 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-r49wp\""
Apr 23 13:40:49.343376 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.343355 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-f4rbk"]
Apr 23 13:40:49.427264 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.427227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggcgd\" (UniqueName: \"kubernetes.io/projected/97a95b10-b768-476e-998a-5cfd6ae0c06c-kube-api-access-ggcgd\") pod \"authorino-operator-7587b89b76-f4rbk\" (UID: \"97a95b10-b768-476e-998a-5cfd6ae0c06c\") " pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk"
Apr 23 13:40:49.527961 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.527925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcgd\" (UniqueName: \"kubernetes.io/projected/97a95b10-b768-476e-998a-5cfd6ae0c06c-kube-api-access-ggcgd\") pod \"authorino-operator-7587b89b76-f4rbk\" (UID: \"97a95b10-b768-476e-998a-5cfd6ae0c06c\") " pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk"
Apr 23 13:40:49.540119 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.540089 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggcgd\" (UniqueName: \"kubernetes.io/projected/97a95b10-b768-476e-998a-5cfd6ae0c06c-kube-api-access-ggcgd\") pod \"authorino-operator-7587b89b76-f4rbk\" (UID: \"97a95b10-b768-476e-998a-5cfd6ae0c06c\") " pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk"
Apr 23 13:40:49.649239 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.649160 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk"
Apr 23 13:40:49.790959 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:49.790907 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-f4rbk"]
Apr 23 13:40:49.793538 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:49.793476 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a95b10_b768_476e_998a_5cfd6ae0c06c.slice/crio-2ac81cd856a0025f5266426aa926a8738193492082629f4c911ce2d78d75d5dd WatchSource:0}: Error finding container 2ac81cd856a0025f5266426aa926a8738193492082629f4c911ce2d78d75d5dd: Status 404 returned error can't find the container with id 2ac81cd856a0025f5266426aa926a8738193492082629f4c911ce2d78d75d5dd
Apr 23 13:40:50.167116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:50.167079 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk" event={"ID":"97a95b10-b768-476e-998a-5cfd6ae0c06c","Type":"ContainerStarted","Data":"2ac81cd856a0025f5266426aa926a8738193492082629f4c911ce2d78d75d5dd"}
Apr 23 13:40:52.180943 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:52.180902 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p" event={"ID":"9a520890-080f-4739-bbea-32aea118ade2","Type":"ContainerStarted","Data":"a9f213856d678177699839d8e24d98931de5a55d70d3640fec9c11ea28d3671f"}
Apr 23 13:40:52.181578 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:52.180974 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:40:52.202905 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:52.202863 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p" podStartSLOduration=1.970466777 podStartE2EDuration="6.20284935s" podCreationTimestamp="2026-04-23 13:40:46 +0000 UTC" firstStartedPulling="2026-04-23 13:40:47.081365818 +0000 UTC m=+581.219644494" lastFinishedPulling="2026-04-23 13:40:51.313748404 +0000 UTC m=+585.452027067" observedRunningTime="2026-04-23 13:40:52.200783369 +0000 UTC m=+586.339062056" watchObservedRunningTime="2026-04-23 13:40:52.20284935 +0000 UTC m=+586.341128035"
Apr 23 13:40:53.185886 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.185838 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk" event={"ID":"97a95b10-b768-476e-998a-5cfd6ae0c06c","Type":"ContainerStarted","Data":"095df0c90023f194acc855e29ae2a8b64534385269599f0d4ac46bb73e18e845"}
Apr 23 13:40:53.186277 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.186082 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk"
Apr 23 13:40:53.209952 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.209904 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk" podStartSLOduration=1.01080513 podStartE2EDuration="4.209890778s" podCreationTimestamp="2026-04-23 13:40:49 +0000 UTC" firstStartedPulling="2026-04-23 13:40:49.796467494 +0000 UTC m=+583.934746157" lastFinishedPulling="2026-04-23 13:40:52.995553129 +0000 UTC m=+587.133831805" observedRunningTime="2026-04-23 13:40:53.207817905 +0000 UTC m=+587.346096589" watchObservedRunningTime="2026-04-23 13:40:53.209890778 +0000 UTC m=+587.348169463"
Apr 23 13:40:53.515471 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.515433 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"]
Apr 23 13:40:53.518704 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.518685 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"
Apr 23 13:40:53.521116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.521099 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-dpt94\""
Apr 23 13:40:53.521221 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.521175 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 23 13:40:53.535257 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.535236 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"]
Apr 23 13:40:53.661643 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.661610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrf7\" (UniqueName: \"kubernetes.io/projected/e4a6a5ec-8f9e-462f-b00b-667f60065b16-kube-api-access-jkrf7\") pod \"dns-operator-controller-manager-844548ff4c-pbnzh\" (UID: \"e4a6a5ec-8f9e-462f-b00b-667f60065b16\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"
Apr 23 13:40:53.762110 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.762081 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrf7\" (UniqueName: \"kubernetes.io/projected/e4a6a5ec-8f9e-462f-b00b-667f60065b16-kube-api-access-jkrf7\") pod \"dns-operator-controller-manager-844548ff4c-pbnzh\" (UID: \"e4a6a5ec-8f9e-462f-b00b-667f60065b16\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"
Apr 23 13:40:53.770673 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.770616 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrf7\" (UniqueName: \"kubernetes.io/projected/e4a6a5ec-8f9e-462f-b00b-667f60065b16-kube-api-access-jkrf7\") pod \"dns-operator-controller-manager-844548ff4c-pbnzh\" (UID: \"e4a6a5ec-8f9e-462f-b00b-667f60065b16\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"
Apr 23 13:40:53.828116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.828089 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"
Apr 23 13:40:53.944695 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:53.944675 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"]
Apr 23 13:40:53.946753 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:40:53.946723 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a6a5ec_8f9e_462f_b00b_667f60065b16.slice/crio-8ab3674354347dac901f81d7e30baf9083371896ac2fbace012bec5d21b46511 WatchSource:0}: Error finding container 8ab3674354347dac901f81d7e30baf9083371896ac2fbace012bec5d21b46511: Status 404 returned error can't find the container with id 8ab3674354347dac901f81d7e30baf9083371896ac2fbace012bec5d21b46511
Apr 23 13:40:54.191530 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:54.191447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh" event={"ID":"e4a6a5ec-8f9e-462f-b00b-667f60065b16","Type":"ContainerStarted","Data":"8ab3674354347dac901f81d7e30baf9083371896ac2fbace012bec5d21b46511"}
Apr 23 13:40:57.202496 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:57.202465 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh" event={"ID":"e4a6a5ec-8f9e-462f-b00b-667f60065b16","Type":"ContainerStarted","Data":"9cc0b494352a2ae4ec5c38c76c4be269272ddb5ce366b3e1fd673797d15264c6"}
Apr 23 13:40:57.202905 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:57.202698 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"
Apr 23 13:40:57.220825 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:40:57.220769 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh" podStartSLOduration=1.6528095839999999 podStartE2EDuration="4.220753368s" podCreationTimestamp="2026-04-23 13:40:53 +0000 UTC" firstStartedPulling="2026-04-23 13:40:53.948735971 +0000 UTC m=+588.087014635" lastFinishedPulling="2026-04-23 13:40:56.51667975 +0000 UTC m=+590.654958419" observedRunningTime="2026-04-23 13:40:57.219724947 +0000 UTC m=+591.358003635" watchObservedRunningTime="2026-04-23 13:40:57.220753368 +0000 UTC m=+591.359032054"
Apr 23 13:41:03.188548 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:03.188517 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-j6x6p"
Apr 23 13:41:04.194186 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:04.194157 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-f4rbk"
Apr 23 13:41:06.370572 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:06.370548 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log"
Apr 23 13:41:06.370932 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:06.370550 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log"
Apr 23 13:41:08.208629 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:08.208602 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-pbnzh"
Apr 23 13:41:39.088675 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.088634 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-sh5bk"]
Apr 23 13:41:39.097865 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.097841 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:39.100701 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.100678 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-g7kfd\""
Apr 23 13:41:39.100938 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.100917 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 23 13:41:39.102237 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.102214 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-sh5bk"]
Apr 23 13:41:39.117563 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.117538 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-sh5bk"]
Apr 23 13:41:39.202875 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.202846 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcn22\" (UniqueName: \"kubernetes.io/projected/c74e27bd-11d7-4cf1-a944-09e65e8f6fe3-kube-api-access-pcn22\") pod \"limitador-limitador-67566c68b4-sh5bk\" (UID: \"c74e27bd-11d7-4cf1-a944-09e65e8f6fe3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:39.203019 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.202895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c74e27bd-11d7-4cf1-a944-09e65e8f6fe3-config-file\") pod \"limitador-limitador-67566c68b4-sh5bk\" (UID: \"c74e27bd-11d7-4cf1-a944-09e65e8f6fe3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:39.304229 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.304186 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcn22\" (UniqueName: \"kubernetes.io/projected/c74e27bd-11d7-4cf1-a944-09e65e8f6fe3-kube-api-access-pcn22\") pod \"limitador-limitador-67566c68b4-sh5bk\" (UID: \"c74e27bd-11d7-4cf1-a944-09e65e8f6fe3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:39.304406 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.304248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c74e27bd-11d7-4cf1-a944-09e65e8f6fe3-config-file\") pod \"limitador-limitador-67566c68b4-sh5bk\" (UID: \"c74e27bd-11d7-4cf1-a944-09e65e8f6fe3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:39.304828 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.304811 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c74e27bd-11d7-4cf1-a944-09e65e8f6fe3-config-file\") pod \"limitador-limitador-67566c68b4-sh5bk\" (UID: \"c74e27bd-11d7-4cf1-a944-09e65e8f6fe3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:39.317480 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.317453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcn22\" (UniqueName: \"kubernetes.io/projected/c74e27bd-11d7-4cf1-a944-09e65e8f6fe3-kube-api-access-pcn22\") pod \"limitador-limitador-67566c68b4-sh5bk\" (UID: \"c74e27bd-11d7-4cf1-a944-09e65e8f6fe3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:39.409200 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.409108 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:39.542166 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:39.542142 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-sh5bk"]
Apr 23 13:41:39.543883 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:41:39.543852 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc74e27bd_11d7_4cf1_a944_09e65e8f6fe3.slice/crio-e03a2978f03ea8d1d868b79339ea80f37e59e154f911b995f3516a64196c3b2a WatchSource:0}: Error finding container e03a2978f03ea8d1d868b79339ea80f37e59e154f911b995f3516a64196c3b2a: Status 404 returned error can't find the container with id e03a2978f03ea8d1d868b79339ea80f37e59e154f911b995f3516a64196c3b2a
Apr 23 13:41:40.355570 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:40.355532 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk" event={"ID":"c74e27bd-11d7-4cf1-a944-09e65e8f6fe3","Type":"ContainerStarted","Data":"e03a2978f03ea8d1d868b79339ea80f37e59e154f911b995f3516a64196c3b2a"}
Apr 23 13:41:44.373358 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:44.373305 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk" event={"ID":"c74e27bd-11d7-4cf1-a944-09e65e8f6fe3","Type":"ContainerStarted","Data":"144479a0456d71b74807dbc32a2793da7f4de0cc25d55c4d05cb40bbe6204b4d"}
Apr 23 13:41:44.373752 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:44.373410 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:41:44.392794 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:44.392749 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk" podStartSLOduration=1.590981303 podStartE2EDuration="5.392737179s" podCreationTimestamp="2026-04-23 13:41:39 +0000 UTC" firstStartedPulling="2026-04-23 13:41:39.545530741 +0000 UTC m=+633.683809407" lastFinishedPulling="2026-04-23 13:41:43.347286616 +0000 UTC m=+637.485565283" observedRunningTime="2026-04-23 13:41:44.390199913 +0000 UTC m=+638.528478599" watchObservedRunningTime="2026-04-23 13:41:44.392737179 +0000 UTC m=+638.531015864"
Apr 23 13:41:55.377949 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:41:55.377919 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-sh5bk"
Apr 23 13:42:15.154788 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.154756 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"]
Apr 23 13:42:15.161667 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.161646 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.171590 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.171561 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"]
Apr 23 13:42:15.289872 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.289829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.290080 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.289892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.290080 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.289947 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgbcg\" (UniqueName: \"kubernetes.io/projected/139e0319-1cde-4812-8f88-f87a37388144-kube-api-access-xgbcg\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.290080 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.290005 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/139e0319-1cde-4812-8f88-f87a37388144-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.290080 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.290038 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/139e0319-1cde-4812-8f88-f87a37388144-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.290307 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.290086 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/139e0319-1cde-4812-8f88-f87a37388144-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.290307 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.290103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.391182 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.391148 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.391382 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.391205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgbcg\" (UniqueName: \"kubernetes.io/projected/139e0319-1cde-4812-8f88-f87a37388144-kube-api-access-xgbcg\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.391382 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.391223 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/139e0319-1cde-4812-8f88-f87a37388144-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.391382 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.391242 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/139e0319-1cde-4812-8f88-f87a37388144-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.391558 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.391384 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/139e0319-1cde-4812-8f88-f87a37388144-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.391558 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.391430 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.391558 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.391478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.392032 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.392006 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/139e0319-1cde-4812-8f88-f87a37388144-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.393603 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.393579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/139e0319-1cde-4812-8f88-f87a37388144-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.393759 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.393739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.393759 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.393753 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.393995 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.393978 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/139e0319-1cde-4812-8f88-f87a37388144-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.399442 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.399419 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgbcg\" (UniqueName: \"kubernetes.io/projected/139e0319-1cde-4812-8f88-f87a37388144-kube-api-access-xgbcg\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.399538 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.399490 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/139e0319-1cde-4812-8f88-f87a37388144-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rmkrp\" (UID: \"139e0319-1cde-4812-8f88-f87a37388144\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.471814 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.471770 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:15.655989 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.655958 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"]
Apr 23 13:42:15.657345 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:42:15.657295 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139e0319_1cde_4812_8f88_f87a37388144.slice/crio-6eac9e4182a73be63d43ae92304067db9f6a0e0422130dd40526f2d580159f2b WatchSource:0}: Error finding container 6eac9e4182a73be63d43ae92304067db9f6a0e0422130dd40526f2d580159f2b: Status 404 returned error can't find the container with id 6eac9e4182a73be63d43ae92304067db9f6a0e0422130dd40526f2d580159f2b
Apr 23 13:42:15.659603 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.659571 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 23 13:42:15.659733 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:15.659641 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 23 13:42:16.481365 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:16.481308 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp" event={"ID":"139e0319-1cde-4812-8f88-f87a37388144","Type":"ContainerStarted","Data":"3d1c7f689cd0a173421c96166977530431d21070d4e5f0d58385b4b106f0f340"}
Apr 23 13:42:16.481365 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:16.481369 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp"
Apr 23 13:42:16.481826
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:16.481383 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp" event={"ID":"139e0319-1cde-4812-8f88-f87a37388144","Type":"ContainerStarted","Data":"6eac9e4182a73be63d43ae92304067db9f6a0e0422130dd40526f2d580159f2b"} Apr 23 13:42:16.483231 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:16.483208 2571 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-rmkrp container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 23 13:42:16.483362 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:16.483251 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp" podUID="139e0319-1cde-4812-8f88-f87a37388144" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:42:16.524870 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:16.524826 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp" podStartSLOduration=1.524814195 podStartE2EDuration="1.524814195s" podCreationTimestamp="2026-04-23 13:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:42:16.52332613 +0000 UTC m=+670.661604814" watchObservedRunningTime="2026-04-23 13:42:16.524814195 +0000 UTC m=+670.663092879" Apr 23 13:42:17.485119 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:17.485090 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rmkrp" Apr 23 13:42:17.599425 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:17.599386 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c"] Apr 23 13:42:17.599740 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:17.599711 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" podUID="254690a2-4108-40d3-9aca-efd709937afd" containerName="discovery" containerID="cri-o://b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526" gracePeriod=30 Apr 23 13:42:17.841850 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:17.841825 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:42:18.011541 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.011503 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-kubeconfig\") pod \"254690a2-4108-40d3-9aca-efd709937afd\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " Apr 23 13:42:18.011718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.011566 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-cacerts\") pod \"254690a2-4108-40d3-9aca-efd709937afd\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " Apr 23 13:42:18.011718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.011596 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/254690a2-4108-40d3-9aca-efd709937afd-local-certs\") pod \"254690a2-4108-40d3-9aca-efd709937afd\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " Apr 23 13:42:18.011718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.011616 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: 
\"kubernetes.io/configmap/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-ca-configmap\") pod \"254690a2-4108-40d3-9aca-efd709937afd\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " Apr 23 13:42:18.011718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.011679 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-dns-cert\") pod \"254690a2-4108-40d3-9aca-efd709937afd\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " Apr 23 13:42:18.011718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.011702 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjnmf\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-kube-api-access-pjnmf\") pod \"254690a2-4108-40d3-9aca-efd709937afd\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " Apr 23 13:42:18.011985 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.011760 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-istio-token\") pod \"254690a2-4108-40d3-9aca-efd709937afd\" (UID: \"254690a2-4108-40d3-9aca-efd709937afd\") " Apr 23 13:42:18.012036 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.012004 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "254690a2-4108-40d3-9aca-efd709937afd" (UID: "254690a2-4108-40d3-9aca-efd709937afd"). InnerVolumeSpecName "istio-csr-ca-configmap". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:42:18.013995 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.013965 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-cacerts" (OuterVolumeSpecName: "cacerts") pod "254690a2-4108-40d3-9aca-efd709937afd" (UID: "254690a2-4108-40d3-9aca-efd709937afd"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:42:18.014419 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.014393 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-istio-token" (OuterVolumeSpecName: "istio-token") pod "254690a2-4108-40d3-9aca-efd709937afd" (UID: "254690a2-4108-40d3-9aca-efd709937afd"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:42:18.014618 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.014585 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "254690a2-4108-40d3-9aca-efd709937afd" (UID: "254690a2-4108-40d3-9aca-efd709937afd"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:42:18.014689 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.014577 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/254690a2-4108-40d3-9aca-efd709937afd-local-certs" (OuterVolumeSpecName: "local-certs") pod "254690a2-4108-40d3-9aca-efd709937afd" (UID: "254690a2-4108-40d3-9aca-efd709937afd"). InnerVolumeSpecName "local-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:42:18.014689 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.014623 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-kube-api-access-pjnmf" (OuterVolumeSpecName: "kube-api-access-pjnmf") pod "254690a2-4108-40d3-9aca-efd709937afd" (UID: "254690a2-4108-40d3-9aca-efd709937afd"). InnerVolumeSpecName "kube-api-access-pjnmf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:42:18.015303 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.015283 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "254690a2-4108-40d3-9aca-efd709937afd" (UID: "254690a2-4108-40d3-9aca-efd709937afd"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:42:18.113047 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.112976 2571 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-ca-configmap\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:42:18.113047 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.113002 2571 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-csr-dns-cert\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:42:18.113047 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.113012 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjnmf\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-kube-api-access-pjnmf\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:42:18.113047 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.113021 2571 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/254690a2-4108-40d3-9aca-efd709937afd-istio-token\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:42:18.113047 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.113029 2571 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-istio-kubeconfig\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:42:18.113047 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.113038 2571 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/254690a2-4108-40d3-9aca-efd709937afd-cacerts\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:42:18.113047 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.113047 2571 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/254690a2-4108-40d3-9aca-efd709937afd-local-certs\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:42:18.488455 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.488421 2571 generic.go:358] "Generic (PLEG): container finished" podID="254690a2-4108-40d3-9aca-efd709937afd" containerID="b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526" exitCode=0 Apr 23 13:42:18.488880 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.488490 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" Apr 23 13:42:18.488880 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.488510 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" event={"ID":"254690a2-4108-40d3-9aca-efd709937afd","Type":"ContainerDied","Data":"b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526"} Apr 23 13:42:18.488880 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.488555 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c" event={"ID":"254690a2-4108-40d3-9aca-efd709937afd","Type":"ContainerDied","Data":"ba93bdc8952b50440dbc70bc9e90c164ce587d0e2cc2013fa3f09e0b53448cbc"} Apr 23 13:42:18.488880 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.488572 2571 scope.go:117] "RemoveContainer" containerID="b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526" Apr 23 13:42:18.496613 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.496597 2571 scope.go:117] "RemoveContainer" containerID="b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526" Apr 23 13:42:18.496862 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:42:18.496842 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526\": container with ID starting with b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526 not found: ID does not exist" containerID="b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526" Apr 23 13:42:18.496926 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.496870 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526"} err="failed to get container status 
\"b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526\": rpc error: code = NotFound desc = could not find container \"b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526\": container with ID starting with b9e6404aa2cec451fb3ca95707d896a4baccee8697ee1846f96ed3bb9dc85526 not found: ID does not exist" Apr 23 13:42:18.522954 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.522923 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c"] Apr 23 13:42:18.526132 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:18.526112 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-kfm5c"] Apr 23 13:42:20.433944 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:42:20.433912 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254690a2-4108-40d3-9aca-efd709937afd" path="/var/lib/kubelet/pods/254690a2-4108-40d3-9aca-efd709937afd/volumes" Apr 23 13:44:11.580902 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.580865 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"] Apr 23 13:44:11.581448 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.581285 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="254690a2-4108-40d3-9aca-efd709937afd" containerName="discovery" Apr 23 13:44:11.581448 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.581306 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="254690a2-4108-40d3-9aca-efd709937afd" containerName="discovery" Apr 23 13:44:11.581448 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.581404 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="254690a2-4108-40d3-9aca-efd709937afd" containerName="discovery" Apr 23 13:44:11.583465 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.583441 2571 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.586212 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.586189 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:44:11.586324 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.586189 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-l5g8n\"" Apr 23 13:44:11.586433 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.586202 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:44:11.586986 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.586959 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 23 13:44:11.597727 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.597705 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"] Apr 23 13:44:11.644421 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.644570 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5pl\" (UniqueName: \"kubernetes.io/projected/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-kube-api-access-wm5pl\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.644570 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644495 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.644570 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644533 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.644692 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644586 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.644692 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644610 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: 
\"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.644692 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644628 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.644692 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.644874 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.644710 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.745873 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.745825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.745873 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.745864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5pl\" (UniqueName: \"kubernetes.io/projected/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-kube-api-access-wm5pl\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.746116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.745890 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.746116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.745923 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.746116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.745963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.746116 ip-10-0-128-108 kubenswrapper[2571]: 
I0423 13:44:11.745989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.746116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.746016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.746116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.746043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.746116 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.746074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" Apr 23 13:44:11.746514 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.746269 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.746903 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.746780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.746903 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.746855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.746903 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.746861 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.746903 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.746901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.748675 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.748655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.748889 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.748869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.756587 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.756564 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5pl\" (UniqueName: \"kubernetes.io/projected/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-kube-api-access-wm5pl\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.757106 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.757085 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a8ae7600-6eac-46dd-b8c2-1f96063d95a6-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-l2djv\" (UID: \"a8ae7600-6eac-46dd-b8c2-1f96063d95a6\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:11.896604 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:11.896520 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:12.035218 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:12.035191 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"]
Apr 23 13:44:12.036536 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:44:12.036510 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ae7600_6eac_46dd_b8c2_1f96063d95a6.slice/crio-9ed237f4a9e19adae3f73809ef31db8f003263df886aa677d1e5489dd71abe41 WatchSource:0}: Error finding container 9ed237f4a9e19adae3f73809ef31db8f003263df886aa677d1e5489dd71abe41: Status 404 returned error can't find the container with id 9ed237f4a9e19adae3f73809ef31db8f003263df886aa677d1e5489dd71abe41
Apr 23 13:44:12.038435 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:12.038416 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:44:12.038718 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:12.038691 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 23 13:44:12.038800 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:12.038744 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 23 13:44:12.038800 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:12.038778 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 23 13:44:12.901463 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:12.901427 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" event={"ID":"a8ae7600-6eac-46dd-b8c2-1f96063d95a6","Type":"ContainerStarted","Data":"cab72e7cdc30c64645ed55a5d9777c4eecfbc4ffdfcbf1bf5960386117f96179"}
Apr 23 13:44:12.901463 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:12.901465 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" event={"ID":"a8ae7600-6eac-46dd-b8c2-1f96063d95a6","Type":"ContainerStarted","Data":"9ed237f4a9e19adae3f73809ef31db8f003263df886aa677d1e5489dd71abe41"}
Apr 23 13:44:13.901038 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:13.900314 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:13.905347 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:13.905296 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:13.928684 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:13.928637 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv" podStartSLOduration=2.928624509 podStartE2EDuration="2.928624509s" podCreationTimestamp="2026-04-23 13:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:44:12.945488593 +0000 UTC m=+787.083767278" watchObservedRunningTime="2026-04-23 13:44:13.928624509 +0000 UTC m=+788.066903194"
Apr 23 13:44:14.907830 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:14.907798 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:14.908881 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:14.908861 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-l2djv"
Apr 23 13:44:37.172315 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.172281 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"]
Apr 23 13:44:37.174542 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.174525 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.177274 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.177253 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec0c69dceeb48768325d1a53a749e65786-kserve-self-signed-certs\""
Apr 23 13:44:37.177468 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.177450 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\""
Apr 23 13:44:37.186788 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.186768 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"]
Apr 23 13:44:37.250872 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.250845 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.251012 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.250901 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.251012 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.250994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.251099 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.251022 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.251099 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.251051 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.251099 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.251070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.251206 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.251106 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc4fn\" (UniqueName: \"kubernetes.io/projected/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kube-api-access-pc4fn\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.352588 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.352551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.352588 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.352591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.352807 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.352630 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.352807 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.352660 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.352807 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.352688 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4fn\" (UniqueName: \"kubernetes.io/projected/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kube-api-access-pc4fn\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.352807 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.352718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.352807 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.352797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.353043 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.353019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tmp-dir\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.353132 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.353084 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kserve-provision-location\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.353190 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.353111 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-model-cache\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.353190 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.353161 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-home\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.355039 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.355017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-dshm\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.355261 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.355241 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tls-certs\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.360253 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.360234 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc4fn\" (UniqueName: \"kubernetes.io/projected/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kube-api-access-pc4fn\") pod \"gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.488683 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.488650 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:37.619242 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.619220 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"]
Apr 23 13:44:37.621319 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:44:37.621281 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf16029bc_2306_41e9_a2e1_5827fbf3be0a.slice/crio-13f384efda9fb6db38e6821ba89f3abfbf124ade1d46b8db693851e22f657a51 WatchSource:0}: Error finding container 13f384efda9fb6db38e6821ba89f3abfbf124ade1d46b8db693851e22f657a51: Status 404 returned error can't find the container with id 13f384efda9fb6db38e6821ba89f3abfbf124ade1d46b8db693851e22f657a51
Apr 23 13:44:37.986513 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:37.986480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl" event={"ID":"f16029bc-2306-41e9-a2e1-5827fbf3be0a","Type":"ContainerStarted","Data":"13f384efda9fb6db38e6821ba89f3abfbf124ade1d46b8db693851e22f657a51"}
Apr 23 13:44:42.005344 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:42.005304 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl" event={"ID":"f16029bc-2306-41e9-a2e1-5827fbf3be0a","Type":"ContainerStarted","Data":"5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9"}
Apr 23 13:44:44.961667 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:44.961628 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"]
Apr 23 13:44:44.962155 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:44.961941 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl" podUID="f16029bc-2306-41e9-a2e1-5827fbf3be0a" containerName="storage-initializer" containerID="cri-o://5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9" gracePeriod=30
Apr 23 13:44:45.511871 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.511848 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:45.634002 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.633917 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tls-certs\") pod \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") "
Apr 23 13:44:45.634144 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634008 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-home\") pod \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") "
Apr 23 13:44:45.634144 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634048 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-model-cache\") pod \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") "
Apr 23 13:44:45.634144 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634095 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tmp-dir\") pod \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") "
Apr 23 13:44:45.634279 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634144 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kserve-provision-location\") pod \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") "
Apr 23 13:44:45.634279 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634169 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-dshm\") pod \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") "
Apr 23 13:44:45.634279 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634196 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc4fn\" (UniqueName: \"kubernetes.io/projected/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kube-api-access-pc4fn\") pod \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\" (UID: \"f16029bc-2306-41e9-a2e1-5827fbf3be0a\") "
Apr 23 13:44:45.634452 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634297 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-home" (OuterVolumeSpecName: "home") pod "f16029bc-2306-41e9-a2e1-5827fbf3be0a" (UID: "f16029bc-2306-41e9-a2e1-5827fbf3be0a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:44:45.634452 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634375 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-model-cache" (OuterVolumeSpecName: "model-cache") pod "f16029bc-2306-41e9-a2e1-5827fbf3be0a" (UID: "f16029bc-2306-41e9-a2e1-5827fbf3be0a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:44:45.634452 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634399 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f16029bc-2306-41e9-a2e1-5827fbf3be0a" (UID: "f16029bc-2306-41e9-a2e1-5827fbf3be0a"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:44:45.634665 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634495 2571 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tmp-dir\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:44:45.634665 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634514 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-home\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:44:45.634665 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.634529 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-model-cache\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:44:45.636701 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.636670 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f16029bc-2306-41e9-a2e1-5827fbf3be0a" (UID: "f16029bc-2306-41e9-a2e1-5827fbf3be0a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:44:45.636701 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.636681 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kube-api-access-pc4fn" (OuterVolumeSpecName: "kube-api-access-pc4fn") pod "f16029bc-2306-41e9-a2e1-5827fbf3be0a" (UID: "f16029bc-2306-41e9-a2e1-5827fbf3be0a"). InnerVolumeSpecName "kube-api-access-pc4fn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:44:45.636951 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.636935 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-dshm" (OuterVolumeSpecName: "dshm") pod "f16029bc-2306-41e9-a2e1-5827fbf3be0a" (UID: "f16029bc-2306-41e9-a2e1-5827fbf3be0a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:44:45.699768 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.699727 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f16029bc-2306-41e9-a2e1-5827fbf3be0a" (UID: "f16029bc-2306-41e9-a2e1-5827fbf3be0a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:44:45.735520 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.735494 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kserve-provision-location\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:44:45.735520 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.735519 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f16029bc-2306-41e9-a2e1-5827fbf3be0a-dshm\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:44:45.735686 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.735532 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pc4fn\" (UniqueName: \"kubernetes.io/projected/f16029bc-2306-41e9-a2e1-5827fbf3be0a-kube-api-access-pc4fn\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:44:45.735686 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:45.735542 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f16029bc-2306-41e9-a2e1-5827fbf3be0a-tls-certs\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\""
Apr 23 13:44:46.024976 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.024938 2571 generic.go:358] "Generic (PLEG): container finished" podID="f16029bc-2306-41e9-a2e1-5827fbf3be0a" containerID="5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9" exitCode=0
Apr 23 13:44:46.025424 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.025003 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"
Apr 23 13:44:46.025424 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.025021 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl" event={"ID":"f16029bc-2306-41e9-a2e1-5827fbf3be0a","Type":"ContainerDied","Data":"5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9"}
Apr 23 13:44:46.025424 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.025056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl" event={"ID":"f16029bc-2306-41e9-a2e1-5827fbf3be0a","Type":"ContainerDied","Data":"13f384efda9fb6db38e6821ba89f3abfbf124ade1d46b8db693851e22f657a51"}
Apr 23 13:44:46.025424 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.025071 2571 scope.go:117] "RemoveContainer" containerID="5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9"
Apr 23 13:44:46.062243 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.062215 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"]
Apr 23 13:44:46.064457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.064438 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-2f0a622e-kserve-57b5f859f7ft5kl"]
Apr 23 13:44:46.093927 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.093904 2571 scope.go:117] "RemoveContainer" containerID="5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9"
Apr 23 13:44:46.094235 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:44:46.094218 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9\": container with ID starting with 5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9 not found: ID does not exist" containerID="5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9"
Apr 23 13:44:46.094292 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.094244 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9"} err="failed to get container status \"5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9\": rpc error: code = NotFound desc = could not find container \"5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9\": container with ID starting with 5dddb2523a562c0b61ec14e8e70ff8d5629b4588ecde698cede31e3c03b122a9 not found: ID does not exist"
Apr 23 13:44:46.440015 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:44:46.439939 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f16029bc-2306-41e9-a2e1-5827fbf3be0a" path="/var/lib/kubelet/pods/f16029bc-2306-41e9-a2e1-5827fbf3be0a/volumes"
Apr 23 13:45:40.790737 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.790699 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"]
Apr 23 13:45:40.791208 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.791068 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f16029bc-2306-41e9-a2e1-5827fbf3be0a" containerName="storage-initializer"
Apr 23 13:45:40.791208 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.791084 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16029bc-2306-41e9-a2e1-5827fbf3be0a" containerName="storage-initializer"
Apr 23 13:45:40.791208 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.791137 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f16029bc-2306-41e9-a2e1-5827fbf3be0a" containerName="storage-initializer"
Apr 23 13:45:40.794135 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.794119 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"
Apr 23 13:45:40.797379 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.797356 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\""
Apr 23 13:45:40.797489 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.797412 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 23 13:45:40.802815 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.802793 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"]
Apr 23 13:45:40.856998 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.856964 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6zk\" (UniqueName: \"kubernetes.io/projected/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kube-api-access-6n6zk\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"
Apr 23 13:45:40.857151 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.857004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"
Apr 23 13:45:40.857151 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.857028 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-dshm\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"
Apr 23 13:45:40.857151 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.857078 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"
Apr 23 13:45:40.857151 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.857124 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"
Apr 23 13:45:40.857151 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.857147 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"
Apr 23 13:45:40.857351 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.857190 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName:
\"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-home\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958259 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958226 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-home\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958426 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n6zk\" (UniqueName: \"kubernetes.io/projected/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kube-api-access-6n6zk\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958426 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958295 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958426 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-dshm\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: 
\"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958426 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958426 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958693 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958450 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958786 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958766 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958828 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958868 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958839 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tmp-dir\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.958934 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.958919 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-home\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.960508 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.960484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-dshm\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.960836 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.960814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tls-certs\") 
pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:40.968877 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:40.968851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6zk\" (UniqueName: \"kubernetes.io/projected/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kube-api-access-6n6zk\") pod \"scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:41.105482 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:41.105399 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:41.227974 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:41.227933 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"] Apr 23 13:45:41.230055 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:45:41.230010 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bfa4341_043d_4b90_a7ef_79a7a1b2cbe5.slice/crio-42a6dfef0f99114e84d9384e39504ac9e7821af4d58051790b4616fd11ffafe4 WatchSource:0}: Error finding container 42a6dfef0f99114e84d9384e39504ac9e7821af4d58051790b4616fd11ffafe4: Status 404 returned error can't find the container with id 42a6dfef0f99114e84d9384e39504ac9e7821af4d58051790b4616fd11ffafe4 Apr 23 13:45:42.233183 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:42.233145 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" 
event={"ID":"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5","Type":"ContainerStarted","Data":"2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711"} Apr 23 13:45:42.233183 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:42.233186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" event={"ID":"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5","Type":"ContainerStarted","Data":"42a6dfef0f99114e84d9384e39504ac9e7821af4d58051790b4616fd11ffafe4"} Apr 23 13:45:46.249145 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:46.249111 2571 generic.go:358] "Generic (PLEG): container finished" podID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" containerID="2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711" exitCode=0 Apr 23 13:45:46.249536 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:46.249186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" event={"ID":"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5","Type":"ContainerDied","Data":"2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711"} Apr 23 13:45:48.262651 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:48.262614 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" event={"ID":"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5","Type":"ContainerStarted","Data":"68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9"} Apr 23 13:45:48.291665 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:48.291609 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" podStartSLOduration=7.093608993 podStartE2EDuration="8.291596369s" podCreationTimestamp="2026-04-23 13:45:40 +0000 UTC" firstStartedPulling="2026-04-23 13:45:46.250278687 +0000 UTC m=+880.388557353" lastFinishedPulling="2026-04-23 
13:45:47.448266051 +0000 UTC m=+881.586544729" observedRunningTime="2026-04-23 13:45:48.289716822 +0000 UTC m=+882.427995506" watchObservedRunningTime="2026-04-23 13:45:48.291596369 +0000 UTC m=+882.429875053" Apr 23 13:45:51.106320 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:51.106244 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:51.106686 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:51.106323 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:51.118666 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:51.118644 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:45:51.291185 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:45:51.291155 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:46:06.404418 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:06.404390 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 13:46:06.407185 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:06.407161 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 13:46:35.418894 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.418855 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"] Apr 23 13:46:35.419316 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.419141 2571 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" podUID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" containerName="main" containerID="cri-o://68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9" gracePeriod=30 Apr 23 13:46:35.668436 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.668412 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:46:35.804184 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804150 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kserve-provision-location\") pod \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " Apr 23 13:46:35.804184 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804184 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-model-cache\") pod \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " Apr 23 13:46:35.804432 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804244 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-dshm\") pod \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " Apr 23 13:46:35.804432 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804269 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n6zk\" (UniqueName: \"kubernetes.io/projected/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kube-api-access-6n6zk\") pod \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\" (UID: 
\"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " Apr 23 13:46:35.804432 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804292 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tls-certs\") pod \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " Apr 23 13:46:35.804432 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804364 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tmp-dir\") pod \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " Apr 23 13:46:35.804606 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804551 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-home\") pod \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\" (UID: \"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5\") " Apr 23 13:46:35.804606 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804570 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" (UID: "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:35.804606 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804579 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-model-cache" (OuterVolumeSpecName: "model-cache") pod "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" (UID: "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:35.804821 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804803 2571 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tmp-dir\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:46:35.804874 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804828 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-model-cache\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:46:35.804911 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.804874 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-home" (OuterVolumeSpecName: "home") pod "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" (UID: "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:35.806491 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.806459 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kube-api-access-6n6zk" (OuterVolumeSpecName: "kube-api-access-6n6zk") pod "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" (UID: "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5"). InnerVolumeSpecName "kube-api-access-6n6zk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:46:35.806824 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.806793 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" (UID: "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:46:35.806824 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.806798 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-dshm" (OuterVolumeSpecName: "dshm") pod "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" (UID: "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:35.869833 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.869784 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" (UID: "7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:35.905202 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.905167 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-dshm\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:46:35.905202 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.905196 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6n6zk\" (UniqueName: \"kubernetes.io/projected/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kube-api-access-6n6zk\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:46:35.905413 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.905211 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-tls-certs\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:46:35.905413 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.905224 2571 
reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-home\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:46:35.905413 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:35.905235 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5-kserve-provision-location\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:46:36.450258 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.450228 2571 generic.go:358] "Generic (PLEG): container finished" podID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" containerID="68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9" exitCode=0 Apr 23 13:46:36.450606 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.450290 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" Apr 23 13:46:36.450606 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.450298 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" event={"ID":"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5","Type":"ContainerDied","Data":"68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9"} Apr 23 13:46:36.450606 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.450343 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq" event={"ID":"7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5","Type":"ContainerDied","Data":"42a6dfef0f99114e84d9384e39504ac9e7821af4d58051790b4616fd11ffafe4"} Apr 23 13:46:36.450606 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.450353 2571 scope.go:117] "RemoveContainer" containerID="68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9" Apr 23 
13:46:36.458814 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.458798 2571 scope.go:117] "RemoveContainer" containerID="2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711" Apr 23 13:46:36.468475 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.468454 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"] Apr 23 13:46:36.472621 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.472602 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5b4d64b8bb-ggpnq"] Apr 23 13:46:36.518572 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.518552 2571 scope.go:117] "RemoveContainer" containerID="68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9" Apr 23 13:46:36.518874 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:46:36.518848 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9\": container with ID starting with 68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9 not found: ID does not exist" containerID="68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9" Apr 23 13:46:36.518916 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.518876 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9"} err="failed to get container status \"68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9\": rpc error: code = NotFound desc = could not find container \"68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9\": container with ID starting with 68f4983e7d1a84098fb5d8f5b8ed91f89d9a0a05ad15290d83ad5e99338ea8d9 not found: ID does not exist" Apr 23 13:46:36.518916 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.518896 
2571 scope.go:117] "RemoveContainer" containerID="2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711" Apr 23 13:46:36.519141 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:46:36.519129 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711\": container with ID starting with 2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711 not found: ID does not exist" containerID="2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711" Apr 23 13:46:36.519180 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:36.519144 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711"} err="failed to get container status \"2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711\": rpc error: code = NotFound desc = could not find container \"2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711\": container with ID starting with 2af889406407e300c3de18bdbcc06eaa4c8ff62669e60acfecf57dac94837711 not found: ID does not exist" Apr 23 13:46:38.433272 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.433237 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" path="/var/lib/kubelet/pods/7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5/volumes" Apr 23 13:46:38.664841 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.664805 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn"] Apr 23 13:46:38.665279 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.665256 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" containerName="storage-initializer" Apr 23 13:46:38.665402 ip-10-0-128-108 kubenswrapper[2571]: I0423 
13:46:38.665281 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" containerName="storage-initializer" Apr 23 13:46:38.665402 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.665309 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" containerName="main" Apr 23 13:46:38.665402 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.665317 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" containerName="main" Apr 23 13:46:38.665570 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.665412 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bfa4341-043d-4b90-a7ef-79a7a1b2cbe5" containerName="main" Apr 23 13:46:38.670594 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.670572 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.673344 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.673303 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 23 13:46:38.673457 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.673420 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\"" Apr 23 13:46:38.682423 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.682399 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn"] Apr 23 13:46:38.827126 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.827090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-home\") pod 
\"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.827302 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.827136 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.827302 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.827187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-model-cache\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.827302 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.827227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfbl5\" (UniqueName: \"kubernetes.io/projected/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kube-api-access-qfbl5\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.827470 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.827324 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: 
\"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.827470 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.827380 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-dshm\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.827470 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.827405 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.928768 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.928732 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-home\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.928768 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.928771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929005 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.928794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-model-cache\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929005 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.928824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfbl5\" (UniqueName: \"kubernetes.io/projected/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kube-api-access-qfbl5\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929005 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.928879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929005 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.928899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-dshm\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929005 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.928917 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929268 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.929168 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-home\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929268 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.929220 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-model-cache\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929367 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.929280 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tmp-dir\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.929367 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.929321 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: 
\"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.931154 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.931136 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-dshm\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.931318 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.931303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.937833 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.937813 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfbl5\" (UniqueName: \"kubernetes.io/projected/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kube-api-access-qfbl5\") pod \"scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:38.985765 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:38.985741 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:39.111909 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:39.111878 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn"] Apr 23 13:46:39.113573 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:46:39.113545 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod880f9313_cfe0_4f7c_b418_2ffa1fd13533.slice/crio-6647c15c68be369458786113841fa6d9fea67ec8c8252ab04605fb7a56f19dc0 WatchSource:0}: Error finding container 6647c15c68be369458786113841fa6d9fea67ec8c8252ab04605fb7a56f19dc0: Status 404 returned error can't find the container with id 6647c15c68be369458786113841fa6d9fea67ec8c8252ab04605fb7a56f19dc0 Apr 23 13:46:39.464460 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:39.464424 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" event={"ID":"880f9313-cfe0-4f7c-b418-2ffa1fd13533","Type":"ContainerStarted","Data":"e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9"} Apr 23 13:46:39.464460 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:39.464464 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" event={"ID":"880f9313-cfe0-4f7c-b418-2ffa1fd13533","Type":"ContainerStarted","Data":"6647c15c68be369458786113841fa6d9fea67ec8c8252ab04605fb7a56f19dc0"} Apr 23 13:46:43.480434 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:43.480347 2571 generic.go:358] "Generic (PLEG): container finished" podID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" containerID="e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9" exitCode=0 Apr 23 13:46:43.480434 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:43.480407 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" event={"ID":"880f9313-cfe0-4f7c-b418-2ffa1fd13533","Type":"ContainerDied","Data":"e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9"} Apr 23 13:46:44.485377 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:44.485322 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" event={"ID":"880f9313-cfe0-4f7c-b418-2ffa1fd13533","Type":"ContainerStarted","Data":"9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56"} Apr 23 13:46:44.504483 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:44.504438 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" podStartSLOduration=6.50442419 podStartE2EDuration="6.50442419s" podCreationTimestamp="2026-04-23 13:46:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:46:44.502718755 +0000 UTC m=+938.640997440" watchObservedRunningTime="2026-04-23 13:46:44.50442419 +0000 UTC m=+938.642702918" Apr 23 13:46:48.986801 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:48.986755 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:48.987207 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:48.986814 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:48.999309 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:46:48.999284 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:46:49.514142 ip-10-0-128-108 kubenswrapper[2571]: I0423 
13:46:49.514115 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:47:13.052078 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.052040 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn"] Apr 23 13:47:13.052610 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.052552 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" podUID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" containerName="main" containerID="cri-o://9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56" gracePeriod=30 Apr 23 13:47:13.298152 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.298126 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:47:13.303364 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303292 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tmp-dir\") pod \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " Apr 23 13:47:13.303364 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303322 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-dshm\") pod \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " Apr 23 13:47:13.303364 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303359 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfbl5\" (UniqueName: 
\"kubernetes.io/projected/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kube-api-access-qfbl5\") pod \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " Apr 23 13:47:13.303588 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303377 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-home\") pod \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " Apr 23 13:47:13.303588 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303403 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tls-certs\") pod \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " Apr 23 13:47:13.303588 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303465 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kserve-provision-location\") pod \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " Apr 23 13:47:13.303588 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303511 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-model-cache\") pod \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\" (UID: \"880f9313-cfe0-4f7c-b418-2ffa1fd13533\") " Apr 23 13:47:13.303588 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303524 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "880f9313-cfe0-4f7c-b418-2ffa1fd13533" (UID: 
"880f9313-cfe0-4f7c-b418-2ffa1fd13533"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:47:13.303851 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303610 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-home" (OuterVolumeSpecName: "home") pod "880f9313-cfe0-4f7c-b418-2ffa1fd13533" (UID: "880f9313-cfe0-4f7c-b418-2ffa1fd13533"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:47:13.303851 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303727 2571 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tmp-dir\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:47:13.303851 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303749 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-home\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:47:13.303851 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.303776 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-model-cache" (OuterVolumeSpecName: "model-cache") pod "880f9313-cfe0-4f7c-b418-2ffa1fd13533" (UID: "880f9313-cfe0-4f7c-b418-2ffa1fd13533"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:47:13.305481 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.305434 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "880f9313-cfe0-4f7c-b418-2ffa1fd13533" (UID: "880f9313-cfe0-4f7c-b418-2ffa1fd13533"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:47:13.305565 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.305482 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kube-api-access-qfbl5" (OuterVolumeSpecName: "kube-api-access-qfbl5") pod "880f9313-cfe0-4f7c-b418-2ffa1fd13533" (UID: "880f9313-cfe0-4f7c-b418-2ffa1fd13533"). InnerVolumeSpecName "kube-api-access-qfbl5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:47:13.305565 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.305505 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-dshm" (OuterVolumeSpecName: "dshm") pod "880f9313-cfe0-4f7c-b418-2ffa1fd13533" (UID: "880f9313-cfe0-4f7c-b418-2ffa1fd13533"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:47:13.363065 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.363023 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "880f9313-cfe0-4f7c-b418-2ffa1fd13533" (UID: "880f9313-cfe0-4f7c-b418-2ffa1fd13533"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:47:13.405083 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.405049 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-dshm\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:47:13.405083 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.405082 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfbl5\" (UniqueName: \"kubernetes.io/projected/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kube-api-access-qfbl5\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:47:13.405278 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.405093 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/880f9313-cfe0-4f7c-b418-2ffa1fd13533-tls-certs\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:47:13.405278 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.405103 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-kserve-provision-location\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:47:13.405278 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.405136 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/880f9313-cfe0-4f7c-b418-2ffa1fd13533-model-cache\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:47:13.590748 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.590658 2571 generic.go:358] "Generic (PLEG): container finished" podID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" containerID="9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56" exitCode=0 Apr 23 13:47:13.590896 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.590739 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" event={"ID":"880f9313-cfe0-4f7c-b418-2ffa1fd13533","Type":"ContainerDied","Data":"9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56"} Apr 23 13:47:13.590896 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.590762 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" Apr 23 13:47:13.590896 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.590775 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn" event={"ID":"880f9313-cfe0-4f7c-b418-2ffa1fd13533","Type":"ContainerDied","Data":"6647c15c68be369458786113841fa6d9fea67ec8c8252ab04605fb7a56f19dc0"} Apr 23 13:47:13.590896 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.590791 2571 scope.go:117] "RemoveContainer" containerID="9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56" Apr 23 13:47:13.599975 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.599956 2571 scope.go:117] "RemoveContainer" containerID="e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9" Apr 23 13:47:13.612320 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.612299 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn"] Apr 23 13:47:13.615892 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.615873 2571 scope.go:117] "RemoveContainer" containerID="9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56" Apr 23 13:47:13.616218 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:47:13.616199 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56\": container with ID starting with 
9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56 not found: ID does not exist" containerID="9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56" Apr 23 13:47:13.616315 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.616230 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56"} err="failed to get container status \"9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56\": rpc error: code = NotFound desc = could not find container \"9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56\": container with ID starting with 9ad7de090a4ed279e0a195d97f3076a091f9f467b75db39280a4f1b53785ef56 not found: ID does not exist" Apr 23 13:47:13.616315 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.616256 2571 scope.go:117] "RemoveContainer" containerID="e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9" Apr 23 13:47:13.616674 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:47:13.616645 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9\": container with ID starting with e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9 not found: ID does not exist" containerID="e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9" Apr 23 13:47:13.616733 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.616680 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-84dfd4948b-7jrwn"] Apr 23 13:47:13.616733 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:13.616678 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9"} err="failed to get container status 
\"e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9\": rpc error: code = NotFound desc = could not find container \"e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9\": container with ID starting with e93df25062a1a02841add4677c547656d961fa5013eab234acad405d272fd9f9 not found: ID does not exist" Apr 23 13:47:14.433603 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:14.433570 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" path="/var/lib/kubelet/pods/880f9313-cfe0-4f7c-b418-2ffa1fd13533/volumes" Apr 23 13:47:15.185578 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.185542 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm"] Apr 23 13:47:15.186085 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.186055 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" containerName="main" Apr 23 13:47:15.186235 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.186084 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" containerName="main" Apr 23 13:47:15.186235 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.186125 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" containerName="storage-initializer" Apr 23 13:47:15.186235 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.186134 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" containerName="storage-initializer" Apr 23 13:47:15.186235 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.186227 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="880f9313-cfe0-4f7c-b418-2ffa1fd13533" containerName="main" Apr 23 13:47:15.191204 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.191184 2571 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.194670 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.194649 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\"" Apr 23 13:47:15.194763 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.194681 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 23 13:47:15.198785 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.198765 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm"] Apr 23 13:47:15.217690 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.217664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e424413-bab6-4260-8b65-9b8c4fd431b5-tls-certs\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.217802 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.217716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-home\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.217802 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.217781 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.217917 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.217833 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.217917 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.217866 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-dshm\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.217917 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.217892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6d9\" (UniqueName: \"kubernetes.io/projected/6e424413-bab6-4260-8b65-9b8c4fd431b5-kube-api-access-lc6d9\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.218041 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.217936 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-model-cache\") pod 
\"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.318908 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.318864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-home\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.319074 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.318927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.319074 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.318967 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.319074 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.319000 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-dshm\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 
13:47:15.319074 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.319025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6d9\" (UniqueName: \"kubernetes.io/projected/6e424413-bab6-4260-8b65-9b8c4fd431b5-kube-api-access-lc6d9\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.319074 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.319066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-model-cache\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.319367 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.319106 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e424413-bab6-4260-8b65-9b8c4fd431b5-tls-certs\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.319367 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.319304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-home\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.319483 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.319403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.319483 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.319451 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.320138 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.319650 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-model-cache\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.322027 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.321968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-dshm\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.322366 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.322315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e424413-bab6-4260-8b65-9b8c4fd431b5-tls-certs\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.326910 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.326891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6d9\" (UniqueName: \"kubernetes.io/projected/6e424413-bab6-4260-8b65-9b8c4fd431b5-kube-api-access-lc6d9\") pod \"precise-prefix-cache-test-kserve-65c6b49884-2cjdm\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.501981 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.501952 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:15.635488 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:15.635464 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm"] Apr 23 13:47:15.637019 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:47:15.636982 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e424413_bab6_4260_8b65_9b8c4fd431b5.slice/crio-73aa2ae6a98ec6a97e6da7c2a2b6a0a58ad3e67b2ff5ac3cc1488f6ae61afb58 WatchSource:0}: Error finding container 73aa2ae6a98ec6a97e6da7c2a2b6a0a58ad3e67b2ff5ac3cc1488f6ae61afb58: Status 404 returned error can't find the container with id 73aa2ae6a98ec6a97e6da7c2a2b6a0a58ad3e67b2ff5ac3cc1488f6ae61afb58 Apr 23 13:47:16.605742 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:16.605703 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" event={"ID":"6e424413-bab6-4260-8b65-9b8c4fd431b5","Type":"ContainerStarted","Data":"7fac2250f0ce58ff5d7860cba10c0ae2c2097e030b44e8493e9f1b81f62e0241"} Apr 23 13:47:16.605742 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:16.605748 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" event={"ID":"6e424413-bab6-4260-8b65-9b8c4fd431b5","Type":"ContainerStarted","Data":"73aa2ae6a98ec6a97e6da7c2a2b6a0a58ad3e67b2ff5ac3cc1488f6ae61afb58"} Apr 23 13:47:20.621161 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:20.621122 2571 generic.go:358] "Generic (PLEG): container finished" podID="6e424413-bab6-4260-8b65-9b8c4fd431b5" containerID="7fac2250f0ce58ff5d7860cba10c0ae2c2097e030b44e8493e9f1b81f62e0241" exitCode=0 Apr 23 13:47:20.621516 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:20.621192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" event={"ID":"6e424413-bab6-4260-8b65-9b8c4fd431b5","Type":"ContainerDied","Data":"7fac2250f0ce58ff5d7860cba10c0ae2c2097e030b44e8493e9f1b81f62e0241"} Apr 23 13:47:21.626841 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:21.626810 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" event={"ID":"6e424413-bab6-4260-8b65-9b8c4fd431b5","Type":"ContainerStarted","Data":"af42722ceea053c4fcfc890aa2d23e6a068aaa1753cd14871b85509377a90b9a"} Apr 23 13:47:21.644325 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:21.644273 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" podStartSLOduration=6.644257292 podStartE2EDuration="6.644257292s" podCreationTimestamp="2026-04-23 13:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:47:21.64367958 +0000 UTC m=+975.781958268" watchObservedRunningTime="2026-04-23 13:47:21.644257292 +0000 UTC m=+975.782535978" Apr 23 13:47:25.502390 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:25.502347 2571 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:25.502390 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:25.502387 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:25.514925 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:25.514896 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:47:25.652197 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:47:25.652170 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:48:01.582994 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.582954 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm"] Apr 23 13:48:01.583515 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.583250 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" podUID="6e424413-bab6-4260-8b65-9b8c4fd431b5" containerName="main" containerID="cri-o://af42722ceea053c4fcfc890aa2d23e6a068aaa1753cd14871b85509377a90b9a" gracePeriod=30 Apr 23 13:48:01.775008 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.774975 2571 generic.go:358] "Generic (PLEG): container finished" podID="6e424413-bab6-4260-8b65-9b8c4fd431b5" containerID="af42722ceea053c4fcfc890aa2d23e6a068aaa1753cd14871b85509377a90b9a" exitCode=0 Apr 23 13:48:01.775175 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.775019 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" 
event={"ID":"6e424413-bab6-4260-8b65-9b8c4fd431b5","Type":"ContainerDied","Data":"af42722ceea053c4fcfc890aa2d23e6a068aaa1753cd14871b85509377a90b9a"} Apr 23 13:48:01.827452 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.827426 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:48:01.902805 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.902731 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-model-cache\") pod \"6e424413-bab6-4260-8b65-9b8c4fd431b5\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " Apr 23 13:48:01.902805 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.902768 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-home\") pod \"6e424413-bab6-4260-8b65-9b8c4fd431b5\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " Apr 23 13:48:01.902805 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.902799 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-tmp-dir\") pod \"6e424413-bab6-4260-8b65-9b8c4fd431b5\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " Apr 23 13:48:01.903063 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.902838 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-dshm\") pod \"6e424413-bab6-4260-8b65-9b8c4fd431b5\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " Apr 23 13:48:01.903063 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.902869 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lc6d9\" (UniqueName: \"kubernetes.io/projected/6e424413-bab6-4260-8b65-9b8c4fd431b5-kube-api-access-lc6d9\") pod \"6e424413-bab6-4260-8b65-9b8c4fd431b5\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " Apr 23 13:48:01.903063 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.902898 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-kserve-provision-location\") pod \"6e424413-bab6-4260-8b65-9b8c4fd431b5\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " Apr 23 13:48:01.903063 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.902946 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e424413-bab6-4260-8b65-9b8c4fd431b5-tls-certs\") pod \"6e424413-bab6-4260-8b65-9b8c4fd431b5\" (UID: \"6e424413-bab6-4260-8b65-9b8c4fd431b5\") " Apr 23 13:48:01.903320 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.903056 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-model-cache" (OuterVolumeSpecName: "model-cache") pod "6e424413-bab6-4260-8b65-9b8c4fd431b5" (UID: "6e424413-bab6-4260-8b65-9b8c4fd431b5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:01.903320 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.903076 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-home" (OuterVolumeSpecName: "home") pod "6e424413-bab6-4260-8b65-9b8c4fd431b5" (UID: "6e424413-bab6-4260-8b65-9b8c4fd431b5"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:01.903320 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.903149 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6e424413-bab6-4260-8b65-9b8c4fd431b5" (UID: "6e424413-bab6-4260-8b65-9b8c4fd431b5"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:01.903320 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.903211 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-home\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:48:01.903320 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.903231 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-model-cache\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:48:01.904913 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.904891 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-dshm" (OuterVolumeSpecName: "dshm") pod "6e424413-bab6-4260-8b65-9b8c4fd431b5" (UID: "6e424413-bab6-4260-8b65-9b8c4fd431b5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:01.905043 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.905018 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e424413-bab6-4260-8b65-9b8c4fd431b5-kube-api-access-lc6d9" (OuterVolumeSpecName: "kube-api-access-lc6d9") pod "6e424413-bab6-4260-8b65-9b8c4fd431b5" (UID: "6e424413-bab6-4260-8b65-9b8c4fd431b5"). InnerVolumeSpecName "kube-api-access-lc6d9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:48:01.905097 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.905034 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e424413-bab6-4260-8b65-9b8c4fd431b5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6e424413-bab6-4260-8b65-9b8c4fd431b5" (UID: "6e424413-bab6-4260-8b65-9b8c4fd431b5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:48:01.956967 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:01.956933 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6e424413-bab6-4260-8b65-9b8c4fd431b5" (UID: "6e424413-bab6-4260-8b65-9b8c4fd431b5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:02.004444 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.004409 2571 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-tmp-dir\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:48:02.004444 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.004437 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-dshm\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:48:02.004444 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.004449 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lc6d9\" (UniqueName: \"kubernetes.io/projected/6e424413-bab6-4260-8b65-9b8c4fd431b5-kube-api-access-lc6d9\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:48:02.004659 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.004461 2571 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e424413-bab6-4260-8b65-9b8c4fd431b5-kserve-provision-location\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:48:02.004659 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.004470 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e424413-bab6-4260-8b65-9b8c4fd431b5-tls-certs\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:48:02.781092 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.781062 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" Apr 23 13:48:02.781489 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.781067 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm" event={"ID":"6e424413-bab6-4260-8b65-9b8c4fd431b5","Type":"ContainerDied","Data":"73aa2ae6a98ec6a97e6da7c2a2b6a0a58ad3e67b2ff5ac3cc1488f6ae61afb58"} Apr 23 13:48:02.781489 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.781185 2571 scope.go:117] "RemoveContainer" containerID="af42722ceea053c4fcfc890aa2d23e6a068aaa1753cd14871b85509377a90b9a" Apr 23 13:48:02.791591 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.791561 2571 scope.go:117] "RemoveContainer" containerID="7fac2250f0ce58ff5d7860cba10c0ae2c2097e030b44e8493e9f1b81f62e0241" Apr 23 13:48:02.805125 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.805097 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm"] Apr 23 13:48:02.810722 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:02.810697 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-2cjdm"] Apr 23 13:48:04.433809 
ip-10-0-128-108 kubenswrapper[2571]: I0423 13:48:04.433776 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e424413-bab6-4260-8b65-9b8c4fd431b5" path="/var/lib/kubelet/pods/6e424413-bab6-4260-8b65-9b8c4fd431b5/volumes" Apr 23 13:51:06.433094 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:51:06.433061 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 13:51:06.443193 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:51:06.443165 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 13:53:39.959399 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.959363 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5495d444b8-sfnwn"] Apr 23 13:53:39.959855 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.959686 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e424413-bab6-4260-8b65-9b8c4fd431b5" containerName="main" Apr 23 13:53:39.959855 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.959699 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e424413-bab6-4260-8b65-9b8c4fd431b5" containerName="main" Apr 23 13:53:39.959855 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.959722 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e424413-bab6-4260-8b65-9b8c4fd431b5" containerName="storage-initializer" Apr 23 13:53:39.959855 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.959728 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e424413-bab6-4260-8b65-9b8c4fd431b5" containerName="storage-initializer" Apr 23 13:53:39.959855 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.959782 2571 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6e424413-bab6-4260-8b65-9b8c4fd431b5" containerName="main" Apr 23 13:53:39.962554 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.962534 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:53:39.965254 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.965229 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:53:39.966111 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.966087 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:53:39.966214 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.966134 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 13:53:39.966214 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.966156 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-8g2fl\"" Apr 23 13:53:39.975881 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:39.975861 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5495d444b8-sfnwn"] Apr 23 13:53:40.021759 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.021731 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/301592a3-8dfc-4cbf-99df-25de2040fb34-cert\") pod \"llmisvc-controller-manager-5495d444b8-sfnwn\" (UID: \"301592a3-8dfc-4cbf-99df-25de2040fb34\") " pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:53:40.021890 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.021777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffx8k\" (UniqueName: 
\"kubernetes.io/projected/301592a3-8dfc-4cbf-99df-25de2040fb34-kube-api-access-ffx8k\") pod \"llmisvc-controller-manager-5495d444b8-sfnwn\" (UID: \"301592a3-8dfc-4cbf-99df-25de2040fb34\") " pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:53:40.122943 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.122908 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/301592a3-8dfc-4cbf-99df-25de2040fb34-cert\") pod \"llmisvc-controller-manager-5495d444b8-sfnwn\" (UID: \"301592a3-8dfc-4cbf-99df-25de2040fb34\") " pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:53:40.123092 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.122961 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffx8k\" (UniqueName: \"kubernetes.io/projected/301592a3-8dfc-4cbf-99df-25de2040fb34-kube-api-access-ffx8k\") pod \"llmisvc-controller-manager-5495d444b8-sfnwn\" (UID: \"301592a3-8dfc-4cbf-99df-25de2040fb34\") " pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:53:40.125221 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.125197 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/301592a3-8dfc-4cbf-99df-25de2040fb34-cert\") pod \"llmisvc-controller-manager-5495d444b8-sfnwn\" (UID: \"301592a3-8dfc-4cbf-99df-25de2040fb34\") " pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:53:40.132483 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.132459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffx8k\" (UniqueName: \"kubernetes.io/projected/301592a3-8dfc-4cbf-99df-25de2040fb34-kube-api-access-ffx8k\") pod \"llmisvc-controller-manager-5495d444b8-sfnwn\" (UID: \"301592a3-8dfc-4cbf-99df-25de2040fb34\") " pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:53:40.271857 ip-10-0-128-108 
kubenswrapper[2571]: I0423 13:53:40.271775 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:53:40.389025 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.388996 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5495d444b8-sfnwn"] Apr 23 13:53:40.390844 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:53:40.390816 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod301592a3_8dfc_4cbf_99df_25de2040fb34.slice/crio-633663de0df8922a807080c921ba13090b537ba21d851a0031add298c9967a5c WatchSource:0}: Error finding container 633663de0df8922a807080c921ba13090b537ba21d851a0031add298c9967a5c: Status 404 returned error can't find the container with id 633663de0df8922a807080c921ba13090b537ba21d851a0031add298c9967a5c Apr 23 13:53:40.392123 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.392105 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:53:40.973914 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:40.973882 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" event={"ID":"301592a3-8dfc-4cbf-99df-25de2040fb34","Type":"ContainerStarted","Data":"633663de0df8922a807080c921ba13090b537ba21d851a0031add298c9967a5c"} Apr 23 13:53:43.988154 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:43.988116 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" event={"ID":"301592a3-8dfc-4cbf-99df-25de2040fb34","Type":"ContainerStarted","Data":"272157db6719a62ceb0bc2f6c1a09f8a2bfccd03d4a63d4c384770faef0972a5"} Apr 23 13:53:43.988541 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:43.988163 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 
13:53:44.007667 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:53:44.007622 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" podStartSLOduration=1.921272064 podStartE2EDuration="5.00760993s" podCreationTimestamp="2026-04-23 13:53:39 +0000 UTC" firstStartedPulling="2026-04-23 13:53:40.392284508 +0000 UTC m=+1354.530563173" lastFinishedPulling="2026-04-23 13:53:43.478622365 +0000 UTC m=+1357.616901039" observedRunningTime="2026-04-23 13:53:44.005059247 +0000 UTC m=+1358.143337954" watchObservedRunningTime="2026-04-23 13:53:44.00760993 +0000 UTC m=+1358.145888604" Apr 23 13:54:14.994656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:54:14.994625 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5495d444b8-sfnwn" Apr 23 13:56:06.467013 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:06.466981 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 13:56:06.470806 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:06.470783 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 13:56:53.805364 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.805315 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 13:56:53.808741 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.808726 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 13:56:53.811412 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.811380 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-8kjw9\"" Apr 23 13:56:53.812095 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.812072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 23 13:56:53.812207 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.812073 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\"" Apr 23 13:56:53.819746 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.819726 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 13:56:53.828816 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.828793 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 13:56:53.828930 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.828839 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 
Apr 23 13:56:53.828993 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.828932 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.829061 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.829017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.829117 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.829061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.829117 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.829090 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.829211 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.829138 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbkk\" (UniqueName: \"kubernetes.io/projected/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kube-api-access-gbbkk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930246 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930207 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930246 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930249 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930508 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930508 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930508 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930508 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930409 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbkk\" (UniqueName: \"kubernetes.io/projected/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kube-api-access-gbbkk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930508 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930738 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930684 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930795 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930893 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.930952 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.930892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.932616 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.932597 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.932706 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.932689 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:53.945075 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:53.945047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbkk\" (UniqueName: \"kubernetes.io/projected/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kube-api-access-gbbkk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:54.120403 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:54.120304 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 23 13:56:54.251950 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:54.251922 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 23 13:56:54.253630 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:56:54.253605 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda20545ad_9664_41b0_bb23_f4c7a85b36d8.slice/crio-4965fb7b87ba2db016a02bcc7823a7010629f68024e0abfc2f4f6112e2ebae95 WatchSource:0}: Error finding container 4965fb7b87ba2db016a02bcc7823a7010629f68024e0abfc2f4f6112e2ebae95: Status 404 returned error can't find the container with id 4965fb7b87ba2db016a02bcc7823a7010629f68024e0abfc2f4f6112e2ebae95
Apr 23 13:56:54.689904 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:54.689868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"a20545ad-9664-41b0-bb23-f4c7a85b36d8","Type":"ContainerStarted","Data":"4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b"}
Apr 23 13:56:54.689904 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:54.689908 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"a20545ad-9664-41b0-bb23-f4c7a85b36d8","Type":"ContainerStarted","Data":"4965fb7b87ba2db016a02bcc7823a7010629f68024e0abfc2f4f6112e2ebae95"}
Apr 23 13:56:58.707781 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:58.707748 2571 generic.go:358] "Generic (PLEG): container finished" podID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" containerID="4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b" exitCode=0
Apr 23 13:56:58.708205 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:56:58.707820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"a20545ad-9664-41b0-bb23-f4c7a85b36d8","Type":"ContainerDied","Data":"4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b"}
Apr 23 13:57:38.876775 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:57:38.876738 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"a20545ad-9664-41b0-bb23-f4c7a85b36d8","Type":"ContainerStarted","Data":"48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de"}
Apr 23 13:57:38.896361 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:57:38.896285 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.677821195 podStartE2EDuration="45.896267028s" podCreationTimestamp="2026-04-23 13:56:53 +0000 UTC" firstStartedPulling="2026-04-23 13:56:58.708933684 +0000 UTC m=+1552.847212347" lastFinishedPulling="2026-04-23 13:57:37.927379516 +0000 UTC m=+1592.065658180" observedRunningTime="2026-04-23 13:57:38.894173944 +0000 UTC m=+1593.032452646" watchObservedRunningTime="2026-04-23 13:57:38.896267028 +0000 UTC m=+1593.034545714"
Apr 23 13:59:25.672058 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.671976 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"]
Apr 23 13:59:25.677477 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.677447 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.680491 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.680468 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-cfl5r\""
Apr 23 13:59:25.689406 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.689381 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"]
Apr 23 13:59:25.818714 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.818679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.818714 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.818718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.818933 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.818749 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khtwq\" (UniqueName: \"kubernetes.io/projected/b6496b4e-654d-4b48-ad97-c2e8d97273d0-kube-api-access-khtwq\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.818933 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.818784 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.818933 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.818859 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.818933 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.818906 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.819083 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.818961 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.819083 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.819004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.819083 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.819052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920421 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920614 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920614 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920458 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920614 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khtwq\" (UniqueName: \"kubernetes.io/projected/b6496b4e-654d-4b48-ad97-c2e8d97273d0-kube-api-access-khtwq\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920614 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920508 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920614 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920541 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920930 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920994 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920952 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.920994 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.920954 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.921096 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.921054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.921155 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.921132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.921278 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.921248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.921439 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.921317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.921595 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.921568 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.923206 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.923146 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.923676 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.923655 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.928884 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.928857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b6496b4e-654d-4b48-ad97-c2e8d97273d0-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.929241 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.929218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khtwq\" (UniqueName: \"kubernetes.io/projected/b6496b4e-654d-4b48-ad97-c2e8d97273d0-kube-api-access-khtwq\") pod \"router-gateway-2-openshift-default-6866b85949-6p5zt\" (UID: \"b6496b4e-654d-4b48-ad97-c2e8d97273d0\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:25.991897 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:25.991866 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:26.133518 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.133493 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"]
Apr 23 13:59:26.136194 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:59:26.136159 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6496b4e_654d_4b48_ad97_c2e8d97273d0.slice/crio-af2125b126c65eadbe507bd90db5001d6164fc983cc5edd6dd86f9a57967080f WatchSource:0}: Error finding container af2125b126c65eadbe507bd90db5001d6164fc983cc5edd6dd86f9a57967080f: Status 404 returned error can't find the container with id af2125b126c65eadbe507bd90db5001d6164fc983cc5edd6dd86f9a57967080f
Apr 23 13:59:26.138035 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.138015 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:59:26.138515 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.138471 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 23 13:59:26.138580 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.138556 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 23 13:59:26.138613 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.138601 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 23 13:59:26.292086 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.292043 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt" event={"ID":"b6496b4e-654d-4b48-ad97-c2e8d97273d0","Type":"ContainerStarted","Data":"40aefb17dd62f3d1a8b0d6fb79328aa88257c4725897b8cab4141d1507256328"}
Apr 23 13:59:26.292271 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.292093 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt" event={"ID":"b6496b4e-654d-4b48-ad97-c2e8d97273d0","Type":"ContainerStarted","Data":"af2125b126c65eadbe507bd90db5001d6164fc983cc5edd6dd86f9a57967080f"}
Apr 23 13:59:26.314980 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.314924 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt" podStartSLOduration=1.314905225 podStartE2EDuration="1.314905225s" podCreationTimestamp="2026-04-23 13:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:59:26.312357121 +0000 UTC m=+1700.450635803" watchObservedRunningTime="2026-04-23 13:59:26.314905225 +0000 UTC m=+1700.453183915"
Apr 23 13:59:26.992435 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:26.992400 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:27.997725 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:27.997695 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:28.299721 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:28.299631 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:28.347072 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:28.347033 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-6p5zt"
Apr 23 13:59:43.040648 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.040605 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"]
Apr 23 13:59:43.045200 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.045180 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.047657 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.047629 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 23 13:59:43.047774 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.047686 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-nxlq8\""
Apr 23 13:59:43.053578 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.053424 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"]
Apr 23 13:59:43.057742 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.057722 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.065824 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.065799 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"]
Apr 23 13:59:43.071313 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.071231 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"]
Apr 23 13:59:43.078039 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078015 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.078168 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078076 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-home\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.078168 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.078168 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078157 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-model-cache\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.078310 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zngbx\" (UniqueName: \"kubernetes.io/projected/49f718e6-37f4-4a2e-a9b7-99980cd55794-kube-api-access-zngbx\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.078310 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078263 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-tmp-dir\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.078427 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078356 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.078427 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.078513 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078451 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-dshm\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.078562 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078520 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f718e6-37f4-4a2e-a9b7-99980cd55794-tls-certs\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.078609 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.078661 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078615 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-home\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.078661 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.078770 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.078670 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5bnl\" (UniqueName: \"kubernetes.io/projected/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kube-api-access-v5bnl\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.179571 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-home\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.179571 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.179827 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179601 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5bnl\" (UniqueName: \"kubernetes.io/projected/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kube-api-access-v5bnl\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.179827 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"
Apr 23 13:59:43.179827 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-home\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"
Apr 23 13:59:43.179827 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179678 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") "
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.179827 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-model-cache\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.179827 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zngbx\" (UniqueName: \"kubernetes.io/projected/49f718e6-37f4-4a2e-a9b7-99980cd55794-kube-api-access-zngbx\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-tmp-dir\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179937 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-home\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.179979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-dshm\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180025 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-home\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f718e6-37f4-4a2e-a9b7-99980cd55794-tls-certs\") pod 
\"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180061 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.180148 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.180656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-tmp-dir\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.180656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 
23 13:59:43.180656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.180656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180490 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.180656 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.180566 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-model-cache\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.182159 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.182138 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-dshm\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.182271 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.182219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.182576 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.182557 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.182576 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.182567 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f718e6-37f4-4a2e-a9b7-99980cd55794-tls-certs\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.188411 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.188384 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5bnl\" (UniqueName: \"kubernetes.io/projected/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kube-api-access-v5bnl\") pod \"router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.188577 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.188558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zngbx\" (UniqueName: \"kubernetes.io/projected/49f718e6-37f4-4a2e-a9b7-99980cd55794-kube-api-access-zngbx\") pod \"router-with-refs-pd-test-kserve-769694c76-wpvqk\" (UID: 
\"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.356515 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.356428 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:43.371131 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.371104 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:43.521883 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.521850 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"] Apr 23 13:59:43.523633 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:59:43.523607 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod853fa6ac_eeaa_424a_905d_98a6ca5d65fc.slice/crio-4f337178fe28698a0f5225ec603455bab02a9cf06a55c075df21e03473b3232e WatchSource:0}: Error finding container 4f337178fe28698a0f5225ec603455bab02a9cf06a55c075df21e03473b3232e: Status 404 returned error can't find the container with id 4f337178fe28698a0f5225ec603455bab02a9cf06a55c075df21e03473b3232e Apr 23 13:59:43.704578 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:43.704544 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"] Apr 23 13:59:43.705618 ip-10-0-128-108 kubenswrapper[2571]: W0423 13:59:43.705594 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f718e6_37f4_4a2e_a9b7_99980cd55794.slice/crio-b7a44895454542707489b3eb4f844a3e2bfbe31f1f883a37e5cdad5032321b52 WatchSource:0}: Error finding container 
b7a44895454542707489b3eb4f844a3e2bfbe31f1f883a37e5cdad5032321b52: Status 404 returned error can't find the container with id b7a44895454542707489b3eb4f844a3e2bfbe31f1f883a37e5cdad5032321b52 Apr 23 13:59:44.369407 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:44.369365 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" event={"ID":"49f718e6-37f4-4a2e-a9b7-99980cd55794","Type":"ContainerStarted","Data":"b7a44895454542707489b3eb4f844a3e2bfbe31f1f883a37e5cdad5032321b52"} Apr 23 13:59:44.370819 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:44.370787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" event={"ID":"853fa6ac-eeaa-424a-905d-98a6ca5d65fc","Type":"ContainerStarted","Data":"47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398"} Apr 23 13:59:44.370819 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:44.370825 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" event={"ID":"853fa6ac-eeaa-424a-905d-98a6ca5d65fc","Type":"ContainerStarted","Data":"4f337178fe28698a0f5225ec603455bab02a9cf06a55c075df21e03473b3232e"} Apr 23 13:59:45.376724 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:45.376671 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" event={"ID":"49f718e6-37f4-4a2e-a9b7-99980cd55794","Type":"ContainerStarted","Data":"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1"} Apr 23 13:59:45.377142 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:45.376967 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:46.383205 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:46.383164 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" event={"ID":"49f718e6-37f4-4a2e-a9b7-99980cd55794","Type":"ContainerStarted","Data":"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d"} Apr 23 13:59:48.391726 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:48.391691 2571 generic.go:358] "Generic (PLEG): container finished" podID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerID="47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398" exitCode=0 Apr 23 13:59:48.392189 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:48.391769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" event={"ID":"853fa6ac-eeaa-424a-905d-98a6ca5d65fc","Type":"ContainerDied","Data":"47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398"} Apr 23 13:59:49.397519 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:49.397483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" event={"ID":"853fa6ac-eeaa-424a-905d-98a6ca5d65fc","Type":"ContainerStarted","Data":"1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771"} Apr 23 13:59:49.417869 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:49.417829 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podStartSLOduration=6.417815383 podStartE2EDuration="6.417815383s" podCreationTimestamp="2026-04-23 13:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:59:49.416516939 +0000 UTC m=+1723.554795624" watchObservedRunningTime="2026-04-23 13:59:49.417815383 +0000 UTC m=+1723.556094102" Apr 23 13:59:50.403467 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:50.403425 2571 
generic.go:358] "Generic (PLEG): container finished" podID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerID="4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d" exitCode=0 Apr 23 13:59:50.403958 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:50.403491 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" event={"ID":"49f718e6-37f4-4a2e-a9b7-99980cd55794","Type":"ContainerDied","Data":"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d"} Apr 23 13:59:51.409958 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:51.409917 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" event={"ID":"49f718e6-37f4-4a2e-a9b7-99980cd55794","Type":"ContainerStarted","Data":"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe"} Apr 23 13:59:51.436999 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:51.436950 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podStartSLOduration=7.396348798 podStartE2EDuration="8.436932568s" podCreationTimestamp="2026-04-23 13:59:43 +0000 UTC" firstStartedPulling="2026-04-23 13:59:43.70763328 +0000 UTC m=+1717.845911942" lastFinishedPulling="2026-04-23 13:59:44.748217048 +0000 UTC m=+1718.886495712" observedRunningTime="2026-04-23 13:59:51.433056934 +0000 UTC m=+1725.571335628" watchObservedRunningTime="2026-04-23 13:59:51.436932568 +0000 UTC m=+1725.575211253" Apr 23 13:59:52.430326 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:52.430287 2571 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" secret="" err="secret \"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-8kjw9\" not found" Apr 23 13:59:52.471467 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:59:52.471431 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 23 13:59:52.471666 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:59:52.471496 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs podName:a20545ad-9664-41b0-bb23-f4c7a85b36d8 nodeName:}" failed. No retries permitted until 2026-04-23 13:59:52.971481866 +0000 UTC m=+1727.109760529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 23 13:59:52.976439 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:59:52.976392 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 23 13:59:52.976642 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:59:52.976480 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs podName:a20545ad-9664-41b0-bb23-f4c7a85b36d8 nodeName:}" failed. No retries permitted until 2026-04-23 13:59:53.976461157 +0000 UTC m=+1728.114739819 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 23 13:59:53.138969 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:53.138932 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 13:59:53.357300 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:53.357218 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:53.357300 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:53.357270 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 13:59:53.358633 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:53.358598 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 13:59:53.371711 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:53.371671 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:53.371831 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:53.371720 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 13:59:53.372793 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:53.372769 2571 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 13:59:53.418482 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:53.418430 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" containerName="main" containerID="cri-o://48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de" gracePeriod=30 Apr 23 13:59:53.985948 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:59:53.985917 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs: secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 23 13:59:53.986391 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:59:53.985999 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs podName:a20545ad-9664-41b0-bb23-f4c7a85b36d8 nodeName:}" failed. No retries permitted until 2026-04-23 13:59:55.985979596 +0000 UTC m=+1730.124258259 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs") pod "llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8") : secret "llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs" not found Apr 23 13:59:54.116260 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.116234 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 13:59:54.187170 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.187085 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-dshm\") pod \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " Apr 23 13:59:54.187347 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.187181 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tmp-dir\") pod \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " Apr 23 13:59:54.187347 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.187216 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-model-cache\") pod \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " Apr 23 13:59:54.187347 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.187241 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-home\") pod \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " Apr 23 13:59:54.187347 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.187310 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kserve-provision-location\") pod \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " Apr 23 13:59:54.187619 ip-10-0-128-108 kubenswrapper[2571]: I0423 
13:59:54.187440 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbbkk\" (UniqueName: \"kubernetes.io/projected/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kube-api-access-gbbkk\") pod \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " Apr 23 13:59:54.187619 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.187483 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs\") pod \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\" (UID: \"a20545ad-9664-41b0-bb23-f4c7a85b36d8\") " Apr 23 13:59:54.187619 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.187575 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-model-cache" (OuterVolumeSpecName: "model-cache") pod "a20545ad-9664-41b0-bb23-f4c7a85b36d8" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:54.187898 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.187852 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-model-cache\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:59:54.188325 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.188299 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-home" (OuterVolumeSpecName: "home") pod "a20545ad-9664-41b0-bb23-f4c7a85b36d8" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:54.189669 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.189641 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-dshm" (OuterVolumeSpecName: "dshm") pod "a20545ad-9664-41b0-bb23-f4c7a85b36d8" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:54.190099 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.190066 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a20545ad-9664-41b0-bb23-f4c7a85b36d8" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:59:54.191229 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.191200 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kube-api-access-gbbkk" (OuterVolumeSpecName: "kube-api-access-gbbkk") pod "a20545ad-9664-41b0-bb23-f4c7a85b36d8" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8"). InnerVolumeSpecName "kube-api-access-gbbkk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:59:54.205630 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.205597 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a20545ad-9664-41b0-bb23-f4c7a85b36d8" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:54.245096 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.245048 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a20545ad-9664-41b0-bb23-f4c7a85b36d8" (UID: "a20545ad-9664-41b0-bb23-f4c7a85b36d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:54.289300 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.289269 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kserve-provision-location\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:59:54.289300 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.289299 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbbkk\" (UniqueName: \"kubernetes.io/projected/a20545ad-9664-41b0-bb23-f4c7a85b36d8-kube-api-access-gbbkk\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:59:54.289506 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.289310 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tls-certs\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:59:54.289506 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.289319 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-dshm\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:59:54.289506 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.289345 2571 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-tmp-dir\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:59:54.289506 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.289356 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a20545ad-9664-41b0-bb23-f4c7a85b36d8-home\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 13:59:54.423565 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.423517 2571 generic.go:358] "Generic (PLEG): container finished" podID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" containerID="48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de" exitCode=0 Apr 23 13:59:54.423736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.423602 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"a20545ad-9664-41b0-bb23-f4c7a85b36d8","Type":"ContainerDied","Data":"48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de"} Apr 23 13:59:54.423736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.423641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"a20545ad-9664-41b0-bb23-f4c7a85b36d8","Type":"ContainerDied","Data":"4965fb7b87ba2db016a02bcc7823a7010629f68024e0abfc2f4f6112e2ebae95"} Apr 23 13:59:54.423736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.423644 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 23 13:59:54.423736 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.423660 2571 scope.go:117] "RemoveContainer" containerID="48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de" Apr 23 13:59:54.437371 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.437242 2571 scope.go:117] "RemoveContainer" containerID="4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b" Apr 23 13:59:54.449468 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.449445 2571 scope.go:117] "RemoveContainer" containerID="48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de" Apr 23 13:59:54.449750 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:59:54.449731 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de\": container with ID starting with 48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de not found: ID does not exist" containerID="48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de" Apr 23 13:59:54.449799 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.449766 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de"} err="failed to get container status \"48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de\": rpc error: code = NotFound desc = could not find container \"48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de\": container with ID starting with 48fd680e23e02870a7340146b917967b289670bb21d84d677dcc93688d27a7de not found: ID does not exist" Apr 23 13:59:54.449799 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.449785 2571 scope.go:117] "RemoveContainer" containerID="4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b" Apr 23 
13:59:54.450041 ip-10-0-128-108 kubenswrapper[2571]: E0423 13:59:54.450025 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b\": container with ID starting with 4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b not found: ID does not exist" containerID="4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b" Apr 23 13:59:54.450096 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.450045 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b"} err="failed to get container status \"4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b\": rpc error: code = NotFound desc = could not find container \"4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b\": container with ID starting with 4984180a7224af02bf12aabd83bb3bf9d955974b2778c3da0d6731bbd957d53b not found: ID does not exist" Apr 23 13:59:54.457765 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.457728 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 13:59:54.462138 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:54.462114 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 23 13:59:56.434842 ip-10-0-128-108 kubenswrapper[2571]: I0423 13:59:56.434809 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" path="/var/lib/kubelet/pods/a20545ad-9664-41b0-bb23-f4c7a85b36d8/volumes" Apr 23 14:00:03.357000 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:03.356941 2571 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:00:03.372034 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:03.371998 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:00:03.375210 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:03.375184 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 14:00:13.357459 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:13.357406 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:00:13.371716 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:13.371681 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:00:23.357732 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:23.357683 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" 
containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:00:23.371492 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:23.371454 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:00:33.357770 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:33.357718 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:00:33.372279 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:33.372242 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:00:43.357071 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:43.357020 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:00:43.372269 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:43.372242 2571 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:00:53.357223 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:53.357097 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:00:53.371921 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:00:53.371882 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:01:03.357641 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:03.357590 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:01:03.372264 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:03.372218 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:01:06.499306 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:06.499273 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 14:01:06.503653 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:06.503632 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zk8wt_a5619baf-099b-4d83-ad43-fd7d0083f57b/ovn-acl-logging/0.log" Apr 23 14:01:13.356882 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:13.356832 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:01:13.371969 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:13.371930 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:01:23.357596 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:23.357518 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:01:23.372505 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:23.372453 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 
10.133.0.41:8000: connect: connection refused" Apr 23 14:01:33.357149 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:33.357104 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:01:33.371539 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:33.371488 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:01:43.357576 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:43.357523 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get \"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:01:43.372502 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:43.372464 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:01:53.357068 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:53.357015 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" probeResult="failure" output="Get 
\"https://10.133.0.40:8001/health\": dial tcp 10.133.0.40:8001: connect: connection refused" Apr 23 14:01:53.371514 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:01:53.371480 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" probeResult="failure" output="Get \"https://10.133.0.41:8000/health\": dial tcp 10.133.0.41:8000: connect: connection refused" Apr 23 14:02:03.366913 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:03.366877 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 14:02:03.379458 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:03.379437 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 14:02:03.385364 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:03.385311 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 14:02:03.392781 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:03.392761 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 14:02:14.693396 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:14.693324 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"] Apr 23 14:02:14.694574 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:14.694495 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" 
containerID="cri-o://595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe" gracePeriod=30 Apr 23 14:02:14.701497 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:14.701473 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"] Apr 23 14:02:14.701762 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:14.701738 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" containerID="cri-o://1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771" gracePeriod=30 Apr 23 14:02:29.643715 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:29.643631 2571 ???:1] "http: TLS handshake error from 10.0.135.229:43500: EOF" Apr 23 14:02:29.644173 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:29.643996 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:29.700887 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:29.700859 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:29.741545 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:29.741519 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:29.775146 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:29.775124 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:29.807998 
ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:29.807980 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:29.834280 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:29.834261 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:29.846832 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:29.846813 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:30.764970 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:30.764943 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:30.781380 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:30.781353 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:30.804867 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:30.804845 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:30.812096 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:30.812079 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:30.823074 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:30.823055 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:30.843042 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:30.843021 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:30.855379 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:30.855361 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:31.771808 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:31.771781 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:31.786073 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:31.786047 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:31.807165 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:31.807142 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:31.815069 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:31.815050 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:31.828239 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:31.828222 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:31.848968 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:31.848946 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:31.861044 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:31.861009 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:32.792552 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:32.792505 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:32.808489 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:32.808469 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:32.832280 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:32.832254 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:32.840938 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:32.840919 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:32.856047 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:32.856029 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:32.875631 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:32.875614 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:32.885900 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:32.885871 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:33.813518 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:33.813474 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:33.834496 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:33.834472 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:33.855646 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:33.855622 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:33.868124 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:33.868102 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:33.881056 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:33.881022 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:33.903207 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:33.903189 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:33.913715 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:33.913698 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:34.837940 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:34.837908 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:34.855932 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:34.855906 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:34.878075 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:34.878054 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:34.887295 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:34.887272 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:34.898793 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:34.898776 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:34.922206 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:34.922184 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:34.934964 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:34.934943 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:35.867178 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:35.867146 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:35.884707 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:35.884687 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:35.907161 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:35.907144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:35.920313 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:35.920291 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:35.938099 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:35.938085 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:35.957289 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:35.957266 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:35.970450 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:35.970434 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:36.919564 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:36.919533 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:36.935893 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:36.935868 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:36.960482 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:36.960463 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:36.968507 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:36.968480 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:36.979120 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:36.979099 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:36.996218 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:36.996198 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:37.004918 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:37.004902 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:37.914428 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:37.914395 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:37.928920 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:37.928879 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:37.951968 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:37.951948 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:37.960012 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:37.959995 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:37.970670 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:37.970648 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:37.988310 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:37.988286 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:37.997961 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:37.997941 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:38.911883 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:38.911843 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:38.928885 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:38.928865 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:38.954657 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:38.954633 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:38.965465 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:38.965445 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:38.982490 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:38.982471 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:39.000655 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.000637 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:39.009293 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.009275 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:39.904949 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.904907 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:39.922577 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.922553 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:39.942570 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.942553 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:39.950406 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.950387 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:39.962269 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.962245 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:39.982193 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.982173 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:39.992070 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:39.992042 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:40.966683 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:40.966656 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:40.984015 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:40.983993 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:41.011116 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:41.011093 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:41.020716 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:41.020692 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:41.033529 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:41.033502 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:41.050316 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:41.050297 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:41.061070 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:41.061054 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:41.975254 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:41.975223 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:41.995307 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:41.995283 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:42.017427 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:42.017405 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:42.033166 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:42.033145 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:42.063907 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:42.063887 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:42.107143 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:42.107119 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:42.152831 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:42.152811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:43.145653 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:43.145628 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-l2djv_a8ae7600-6eac-46dd-b8c2-1f96063d95a6/istio-proxy/0.log" Apr 23 14:02:43.164063 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:43.164042 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-6p5zt_b6496b4e-654d-4b48-ad97-c2e8d97273d0/istio-proxy/0.log" Apr 23 14:02:43.190443 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:43.190422 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:43.209766 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:43.209747 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/llm-d-routing-sidecar/0.log" Apr 23 14:02:43.229615 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:43.229596 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/storage-initializer/0.log" Apr 23 14:02:43.255710 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:43.255693 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/main/0.log" Apr 23 14:02:43.273765 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:43.273742 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q_853fa6ac-eeaa-424a-905d-98a6ca5d65fc/storage-initializer/0.log" Apr 23 14:02:44.328165 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:44.328131 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rmkrp_139e0319-1cde-4812-8f88-f87a37388144/discovery/0.log" Apr 23 14:02:44.343218 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:44.343193 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-nsr6v_340b1383-bc96-4f31-a84c-6b1b58369cbe/istio-proxy/0.log" Apr 23 14:02:44.695212 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:44.695104 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="llm-d-routing-sidecar" containerID="cri-o://054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1" gracePeriod=2 Apr 23 14:02:44.995881 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:44.995860 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:44.996605 ip-10-0-128-108 kubenswrapper[2571]: I0423 
14:02:44.996586 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 14:02:44.999064 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:44.999047 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 14:02:45.013508 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013483 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tls-certs\") pod \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " Apr 23 14:02:45.013630 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013522 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kserve-provision-location\") pod \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " Apr 23 14:02:45.013630 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013560 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-kserve-provision-location\") pod \"49f718e6-37f4-4a2e-a9b7-99980cd55794\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " Apr 23 14:02:45.013630 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013588 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-model-cache\") pod \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " Apr 23 14:02:45.013630 ip-10-0-128-108 kubenswrapper[2571]: I0423 
14:02:45.013613 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-model-cache\") pod \"49f718e6-37f4-4a2e-a9b7-99980cd55794\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " Apr 23 14:02:45.013851 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013636 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tmp-dir\") pod \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " Apr 23 14:02:45.013851 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013672 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-home\") pod \"49f718e6-37f4-4a2e-a9b7-99980cd55794\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " Apr 23 14:02:45.013851 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013711 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-dshm\") pod \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " Apr 23 14:02:45.013851 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013740 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5bnl\" (UniqueName: \"kubernetes.io/projected/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kube-api-access-v5bnl\") pod \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " Apr 23 14:02:45.013851 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013783 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-dshm\") pod \"49f718e6-37f4-4a2e-a9b7-99980cd55794\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " Apr 23 14:02:45.013851 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013831 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zngbx\" (UniqueName: \"kubernetes.io/projected/49f718e6-37f4-4a2e-a9b7-99980cd55794-kube-api-access-zngbx\") pod \"49f718e6-37f4-4a2e-a9b7-99980cd55794\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " Apr 23 14:02:45.013851 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013836 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-model-cache" (OuterVolumeSpecName: "model-cache") pod "853fa6ac-eeaa-424a-905d-98a6ca5d65fc" (UID: "853fa6ac-eeaa-424a-905d-98a6ca5d65fc"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.014180 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013870 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-tmp-dir\") pod \"49f718e6-37f4-4a2e-a9b7-99980cd55794\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " Apr 23 14:02:45.014180 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013929 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f718e6-37f4-4a2e-a9b7-99980cd55794-tls-certs\") pod \"49f718e6-37f4-4a2e-a9b7-99980cd55794\" (UID: \"49f718e6-37f4-4a2e-a9b7-99980cd55794\") " Apr 23 14:02:45.014180 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.013997 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-home\") pod 
\"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\" (UID: \"853fa6ac-eeaa-424a-905d-98a6ca5d65fc\") " Apr 23 14:02:45.014322 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.014252 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-model-cache\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.014510 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.014479 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-home" (OuterVolumeSpecName: "home") pod "49f718e6-37f4-4a2e-a9b7-99980cd55794" (UID: "49f718e6-37f4-4a2e-a9b7-99980cd55794"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.015173 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.014933 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-home" (OuterVolumeSpecName: "home") pod "853fa6ac-eeaa-424a-905d-98a6ca5d65fc" (UID: "853fa6ac-eeaa-424a-905d-98a6ca5d65fc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.015316 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.015261 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-model-cache" (OuterVolumeSpecName: "model-cache") pod "49f718e6-37f4-4a2e-a9b7-99980cd55794" (UID: "49f718e6-37f4-4a2e-a9b7-99980cd55794"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.016455 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.016358 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "853fa6ac-eeaa-424a-905d-98a6ca5d65fc" (UID: "853fa6ac-eeaa-424a-905d-98a6ca5d65fc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:02:45.017177 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.017140 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kube-api-access-v5bnl" (OuterVolumeSpecName: "kube-api-access-v5bnl") pod "853fa6ac-eeaa-424a-905d-98a6ca5d65fc" (UID: "853fa6ac-eeaa-424a-905d-98a6ca5d65fc"). InnerVolumeSpecName "kube-api-access-v5bnl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:02:45.017596 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.017568 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49f718e6-37f4-4a2e-a9b7-99980cd55794-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "49f718e6-37f4-4a2e-a9b7-99980cd55794" (UID: "49f718e6-37f4-4a2e-a9b7-99980cd55794"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 14:02:45.017900 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.017877 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-dshm" (OuterVolumeSpecName: "dshm") pod "853fa6ac-eeaa-424a-905d-98a6ca5d65fc" (UID: "853fa6ac-eeaa-424a-905d-98a6ca5d65fc"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.018207 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.018189 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f718e6-37f4-4a2e-a9b7-99980cd55794-kube-api-access-zngbx" (OuterVolumeSpecName: "kube-api-access-zngbx") pod "49f718e6-37f4-4a2e-a9b7-99980cd55794" (UID: "49f718e6-37f4-4a2e-a9b7-99980cd55794"). InnerVolumeSpecName "kube-api-access-zngbx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:02:45.019522 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.019486 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-dshm" (OuterVolumeSpecName: "dshm") pod "49f718e6-37f4-4a2e-a9b7-99980cd55794" (UID: "49f718e6-37f4-4a2e-a9b7-99980cd55794"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.033694 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.033667 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "49f718e6-37f4-4a2e-a9b7-99980cd55794" (UID: "49f718e6-37f4-4a2e-a9b7-99980cd55794"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.035473 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.035443 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "853fa6ac-eeaa-424a-905d-98a6ca5d65fc" (UID: "853fa6ac-eeaa-424a-905d-98a6ca5d65fc"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.082774 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.082722 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "49f718e6-37f4-4a2e-a9b7-99980cd55794" (UID: "49f718e6-37f4-4a2e-a9b7-99980cd55794"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.083827 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.083802 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "853fa6ac-eeaa-424a-905d-98a6ca5d65fc" (UID: "853fa6ac-eeaa-424a-905d-98a6ca5d65fc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:45.106295 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.106263 2571 generic.go:358] "Generic (PLEG): container finished" podID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerID="1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771" exitCode=137 Apr 23 14:02:45.106444 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.106324 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" Apr 23 14:02:45.106444 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.106367 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" event={"ID":"853fa6ac-eeaa-424a-905d-98a6ca5d65fc","Type":"ContainerDied","Data":"1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771"} Apr 23 14:02:45.106444 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.106407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q" event={"ID":"853fa6ac-eeaa-424a-905d-98a6ca5d65fc","Type":"ContainerDied","Data":"4f337178fe28698a0f5225ec603455bab02a9cf06a55c075df21e03473b3232e"} Apr 23 14:02:45.106444 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.106425 2571 scope.go:117] "RemoveContainer" containerID="1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771" Apr 23 14:02:45.107635 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.107617 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-769694c76-wpvqk_49f718e6-37f4-4a2e-a9b7-99980cd55794/main/0.log" Apr 23 14:02:45.108132 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.108111 2571 generic.go:358] "Generic (PLEG): container finished" podID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerID="595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe" exitCode=137 Apr 23 14:02:45.108220 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.108132 2571 generic.go:358] "Generic (PLEG): container finished" podID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerID="054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1" exitCode=0 Apr 23 14:02:45.108220 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.108184 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" Apr 23 14:02:45.108220 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.108200 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" event={"ID":"49f718e6-37f4-4a2e-a9b7-99980cd55794","Type":"ContainerDied","Data":"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe"} Apr 23 14:02:45.108373 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.108224 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" event={"ID":"49f718e6-37f4-4a2e-a9b7-99980cd55794","Type":"ContainerDied","Data":"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1"} Apr 23 14:02:45.108373 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.108238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk" event={"ID":"49f718e6-37f4-4a2e-a9b7-99980cd55794","Type":"ContainerDied","Data":"b7a44895454542707489b3eb4f844a3e2bfbe31f1f883a37e5cdad5032321b52"} Apr 23 14:02:45.114889 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114867 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-home\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114893 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-dshm\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114907 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5bnl\" (UniqueName: 
\"kubernetes.io/projected/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kube-api-access-v5bnl\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114920 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-dshm\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114933 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zngbx\" (UniqueName: \"kubernetes.io/projected/49f718e6-37f4-4a2e-a9b7-99980cd55794-kube-api-access-zngbx\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114946 2571 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-tmp-dir\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114960 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/49f718e6-37f4-4a2e-a9b7-99980cd55794-tls-certs\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114973 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-home\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.114985 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tls-certs\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.114999 ip-10-0-128-108 
kubenswrapper[2571]: I0423 14:02:45.114997 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-kserve-provision-location\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.115387 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.115013 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-kserve-provision-location\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.115387 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.115027 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/49f718e6-37f4-4a2e-a9b7-99980cd55794-model-cache\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.115387 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.115042 2571 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/853fa6ac-eeaa-424a-905d-98a6ca5d65fc-tmp-dir\") on node \"ip-10-0-128-108.ec2.internal\" DevicePath \"\"" Apr 23 14:02:45.115387 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.115287 2571 scope.go:117] "RemoveContainer" containerID="47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398" Apr 23 14:02:45.129754 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.129738 2571 scope.go:117] "RemoveContainer" containerID="1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771" Apr 23 14:02:45.129975 ip-10-0-128-108 kubenswrapper[2571]: E0423 14:02:45.129957 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771\": container with ID starting with 
1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771 not found: ID does not exist" containerID="1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771" Apr 23 14:02:45.130020 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.129982 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771"} err="failed to get container status \"1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771\": rpc error: code = NotFound desc = could not find container \"1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771\": container with ID starting with 1d2d8f765d3b889ab515142fc5721a151fd96107f19c790f08a5328d17556771 not found: ID does not exist" Apr 23 14:02:45.130020 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.129997 2571 scope.go:117] "RemoveContainer" containerID="47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398" Apr 23 14:02:45.130187 ip-10-0-128-108 kubenswrapper[2571]: E0423 14:02:45.130174 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398\": container with ID starting with 47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398 not found: ID does not exist" containerID="47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398" Apr 23 14:02:45.130222 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.130191 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398"} err="failed to get container status \"47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398\": rpc error: code = NotFound desc = could not find container \"47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398\": container with ID starting with 
47e97cff2ded92d1c7bfa55de5cccec6cca47e9dade7775bf58a494e22a96398 not found: ID does not exist" Apr 23 14:02:45.130222 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.130201 2571 scope.go:117] "RemoveContainer" containerID="595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe" Apr 23 14:02:45.136621 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.136594 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"] Apr 23 14:02:45.138271 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.138259 2571 scope.go:117] "RemoveContainer" containerID="4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d" Apr 23 14:02:45.142765 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.142746 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-769694c76-wpvqk"] Apr 23 14:02:45.148067 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.148047 2571 scope.go:117] "RemoveContainer" containerID="054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1" Apr 23 14:02:45.154852 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.154828 2571 scope.go:117] "RemoveContainer" containerID="595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe" Apr 23 14:02:45.155083 ip-10-0-128-108 kubenswrapper[2571]: E0423 14:02:45.155066 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe\": container with ID starting with 595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe not found: ID does not exist" containerID="595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe" Apr 23 14:02:45.155149 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155088 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe"} err="failed to get container status \"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe\": rpc error: code = NotFound desc = could not find container \"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe\": container with ID starting with 595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe not found: ID does not exist" Apr 23 14:02:45.155149 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155107 2571 scope.go:117] "RemoveContainer" containerID="4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d" Apr 23 14:02:45.155293 ip-10-0-128-108 kubenswrapper[2571]: E0423 14:02:45.155278 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d\": container with ID starting with 4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d not found: ID does not exist" containerID="4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d" Apr 23 14:02:45.155371 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155296 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d"} err="failed to get container status \"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d\": rpc error: code = NotFound desc = could not find container \"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d\": container with ID starting with 4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d not found: ID does not exist" Apr 23 14:02:45.155371 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155309 2571 scope.go:117] "RemoveContainer" containerID="054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1" Apr 23 14:02:45.155545 ip-10-0-128-108 
kubenswrapper[2571]: E0423 14:02:45.155530 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1\": container with ID starting with 054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1 not found: ID does not exist" containerID="054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1" Apr 23 14:02:45.155607 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155548 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1"} err="failed to get container status \"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1\": rpc error: code = NotFound desc = could not find container \"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1\": container with ID starting with 054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1 not found: ID does not exist" Apr 23 14:02:45.155607 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155563 2571 scope.go:117] "RemoveContainer" containerID="595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe" Apr 23 14:02:45.155768 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155749 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe"} err="failed to get container status \"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe\": rpc error: code = NotFound desc = could not find container \"595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe\": container with ID starting with 595addf4dfbafcf32eea2e8d6a98c593d8d336118297be05ae841756989db1fe not found: ID does not exist" Apr 23 14:02:45.155828 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155768 2571 scope.go:117] "RemoveContainer" 
containerID="4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d" Apr 23 14:02:45.155950 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155934 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d"} err="failed to get container status \"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d\": rpc error: code = NotFound desc = could not find container \"4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d\": container with ID starting with 4e2ba7888a2ac0c1190ced48325435de5905ed23d355be54c284d8bbba8f942d not found: ID does not exist" Apr 23 14:02:45.156011 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.155950 2571 scope.go:117] "RemoveContainer" containerID="054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1" Apr 23 14:02:45.156145 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.156130 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1"} err="failed to get container status \"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1\": rpc error: code = NotFound desc = could not find container \"054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1\": container with ID starting with 054d06216633bf533597b7ccdd74e459236198564da502860ca41015439ac3d1 not found: ID does not exist" Apr 23 14:02:45.160975 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.160951 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"] Apr 23 14:02:45.167260 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.167241 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-66766f9bb7-t687q"] Apr 23 14:02:45.207848 ip-10-0-128-108 
kubenswrapper[2571]: I0423 14:02:45.207820 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rmkrp_139e0319-1cde-4812-8f88-f87a37388144/discovery/0.log" Apr 23 14:02:45.222257 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:45.222238 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-nsr6v_340b1383-bc96-4f31-a84c-6b1b58369cbe/istio-proxy/0.log" Apr 23 14:02:46.042977 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:46.042894 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-f4rbk_97a95b10-b768-476e-998a-5cfd6ae0c06c/manager/0.log" Apr 23 14:02:46.057539 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:46.057516 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-pbnzh_e4a6a5ec-8f9e-462f-b00b-667f60065b16/manager/0.log" Apr 23 14:02:46.121210 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:46.121164 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-j6x6p_9a520890-080f-4739-bbea-32aea118ade2/manager/0.log" Apr 23 14:02:46.139304 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:46.139284 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-sh5bk_c74e27bd-11d7-4cf1-a944-09e65e8f6fe3/limitador/0.log" Apr 23 14:02:46.434368 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:46.434272 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" path="/var/lib/kubelet/pods/49f718e6-37f4-4a2e-a9b7-99980cd55794/volumes" Apr 23 14:02:46.434767 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:46.434753 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" 
path="/var/lib/kubelet/pods/853fa6ac-eeaa-424a-905d-98a6ca5d65fc/volumes" Apr 23 14:02:51.572436 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:51.572404 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lv2lp_1e27565d-df42-41a7-9d41-eb8595cf751e/global-pull-secret-syncer/0.log" Apr 23 14:02:51.652470 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:51.652440 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5kfq8_c3276fec-f4f2-47b0-bc26-38fc5eea9ab7/konnectivity-agent/0.log" Apr 23 14:02:51.739163 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:51.739122 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-108.ec2.internal_95cd19913bda1121c44084ba2ccca700/haproxy/0.log" Apr 23 14:02:55.604703 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:55.604668 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-f4rbk_97a95b10-b768-476e-998a-5cfd6ae0c06c/manager/0.log" Apr 23 14:02:55.658461 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:55.658436 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-pbnzh_e4a6a5ec-8f9e-462f-b00b-667f60065b16/manager/0.log" Apr 23 14:02:55.740197 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:55.740167 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-j6x6p_9a520890-080f-4739-bbea-32aea118ade2/manager/0.log" Apr 23 14:02:55.769099 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:55.769068 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-sh5bk_c74e27bd-11d7-4cf1-a944-09e65e8f6fe3/limitador/0.log" Apr 23 14:02:57.288747 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:57.288718 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-cxbrs_743cfb34-da8a-415a-8dbe-192227645691/node-exporter/0.log" Apr 23 14:02:57.318541 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:57.318514 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxbrs_743cfb34-da8a-415a-8dbe-192227645691/kube-rbac-proxy/0.log" Apr 23 14:02:57.342107 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:02:57.342086 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxbrs_743cfb34-da8a-415a-8dbe-192227645691/init-textfile/0.log" Apr 23 14:03:00.707155 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707128 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq"] Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707459 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707471 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707481 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="storage-initializer" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707487 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="storage-initializer" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707495 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="storage-initializer" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: 
I0423 14:03:00.707501 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="storage-initializer" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707507 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" containerName="main" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707512 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" containerName="main" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707517 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="llm-d-routing-sidecar" Apr 23 14:03:00.707515 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707522 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="llm-d-routing-sidecar" Apr 23 14:03:00.707860 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707529 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" containerName="storage-initializer" Apr 23 14:03:00.707860 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707534 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" containerName="storage-initializer" Apr 23 14:03:00.707860 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707541 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" Apr 23 14:03:00.707860 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707545 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" Apr 23 14:03:00.707860 ip-10-0-128-108 kubenswrapper[2571]: I0423 
14:03:00.707599 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="main" Apr 23 14:03:00.707860 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707607 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="49f718e6-37f4-4a2e-a9b7-99980cd55794" containerName="llm-d-routing-sidecar" Apr 23 14:03:00.707860 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707614 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="853fa6ac-eeaa-424a-905d-98a6ca5d65fc" containerName="main" Apr 23 14:03:00.707860 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.707621 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a20545ad-9664-41b0-bb23-f4c7a85b36d8" containerName="main" Apr 23 14:03:00.711622 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.711608 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.714663 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.714632 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s9qb\"/\"kube-root-ca.crt\"" Apr 23 14:03:00.714975 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.714954 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s9qb\"/\"openshift-service-ca.crt\"" Apr 23 14:03:00.715650 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.715612 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7s9qb\"/\"default-dockercfg-b5cf6\"" Apr 23 14:03:00.723250 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.723225 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq"] Apr 23 14:03:00.734049 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.734028 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-lib-modules\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.734283 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.734262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5cff\" (UniqueName: \"kubernetes.io/projected/18b42646-3458-4e34-b32c-88ea146c2dfd-kube-api-access-w5cff\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.734519 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.734498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-sys\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.734613 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.734536 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-podres\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.734613 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.734562 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-proc\") pod 
\"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835497 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-lib-modules\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835652 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5cff\" (UniqueName: \"kubernetes.io/projected/18b42646-3458-4e34-b32c-88ea146c2dfd-kube-api-access-w5cff\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835652 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-sys\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835652 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-podres\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835652 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835588 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-proc\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835652 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835633 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-lib-modules\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835936 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-sys\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835936 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835700 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-proc\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.835936 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:00.835742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/18b42646-3458-4e34-b32c-88ea146c2dfd-podres\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" Apr 23 14:03:00.845591 ip-10-0-128-108 
kubenswrapper[2571]: I0423 14:03:00.845567 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5cff\" (UniqueName: \"kubernetes.io/projected/18b42646-3458-4e34-b32c-88ea146c2dfd-kube-api-access-w5cff\") pod \"perf-node-gather-daemonset-8ggtq\" (UID: \"18b42646-3458-4e34-b32c-88ea146c2dfd\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq"
Apr 23 14:03:01.024093 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:01.023981 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq"
Apr 23 14:03:01.145568 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:01.145540 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq"]
Apr 23 14:03:01.146960 ip-10-0-128-108 kubenswrapper[2571]: W0423 14:03:01.146932 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod18b42646_3458_4e34_b32c_88ea146c2dfd.slice/crio-812fb32fee690032f7b898fc0acdc3a4da3c2bbd7d6cc7330e3b5e8289bfb9b9 WatchSource:0}: Error finding container 812fb32fee690032f7b898fc0acdc3a4da3c2bbd7d6cc7330e3b5e8289bfb9b9: Status 404 returned error can't find the container with id 812fb32fee690032f7b898fc0acdc3a4da3c2bbd7d6cc7330e3b5e8289bfb9b9
Apr 23 14:03:01.172190 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:01.172160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" event={"ID":"18b42646-3458-4e34-b32c-88ea146c2dfd","Type":"ContainerStarted","Data":"812fb32fee690032f7b898fc0acdc3a4da3c2bbd7d6cc7330e3b5e8289bfb9b9"}
Apr 23 14:03:01.986653 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:01.986626 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vf5fb_495e17e1-7de5-454d-a2b3-240f3cf879a4/dns/0.log"
Apr 23 14:03:02.022650 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:02.022622 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vf5fb_495e17e1-7de5-454d-a2b3-240f3cf879a4/kube-rbac-proxy/0.log"
Apr 23 14:03:02.172397 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:02.172368 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-nbf64_cecba8e3-e60c-4053-96af-5ab1c4960855/dns-node-resolver/0.log"
Apr 23 14:03:02.176425 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:02.176395 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" event={"ID":"18b42646-3458-4e34-b32c-88ea146c2dfd","Type":"ContainerStarted","Data":"dde33431fa3ce9968ab20946bbd1db079944592da7d94b63e2b2fb618ff9bb49"}
Apr 23 14:03:02.176544 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:02.176450 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq"
Apr 23 14:03:02.198466 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:02.198425 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq" podStartSLOduration=2.198413923 podStartE2EDuration="2.198413923s" podCreationTimestamp="2026-04-23 14:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:03:02.19558189 +0000 UTC m=+1916.333860582" watchObservedRunningTime="2026-04-23 14:03:02.198413923 +0000 UTC m=+1916.336692608"
Apr 23 14:03:02.671137 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:02.671105 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2rlxj_c524008d-56ba-4f94-99b4-bd3ef55ba66f/node-ca/0.log"
Apr 23 14:03:03.595424 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:03.595391 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rmkrp_139e0319-1cde-4812-8f88-f87a37388144/discovery/0.log"
Apr 23 14:03:03.624436 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:03.624402 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-nsr6v_340b1383-bc96-4f31-a84c-6b1b58369cbe/istio-proxy/0.log"
Apr 23 14:03:04.188133 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:04.188093 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wtf4m_6566a883-ca9d-4251-b7f3-7e0e087e3020/serve-healthcheck-canary/0.log"
Apr 23 14:03:04.817728 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:04.817698 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pn526_12e5c21a-627a-4ad7-b68e-6300b337cf27/kube-rbac-proxy/0.log"
Apr 23 14:03:04.846346 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:04.846306 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pn526_12e5c21a-627a-4ad7-b68e-6300b337cf27/exporter/0.log"
Apr 23 14:03:04.875141 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:04.875119 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pn526_12e5c21a-627a-4ad7-b68e-6300b337cf27/extractor/0.log"
Apr 23 14:03:08.189905 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:08.189878 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-8ggtq"
Apr 23 14:03:08.385357 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:08.385321 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-5495d444b8-sfnwn_301592a3-8dfc-4cbf-99df-25de2040fb34/manager/0.log"
Apr 23 14:03:15.622504 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:15.622475 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2g7ct_abbb4dd8-8bc4-4850-bdc6-2b1ed2494101/kube-multus/0.log"
Apr 23 14:03:16.308457 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.308426 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rm6cm_c1494fbd-44b0-417d-857b-089a1705bbe9/kube-multus-additional-cni-plugins/0.log"
Apr 23 14:03:16.348386 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.348357 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rm6cm_c1494fbd-44b0-417d-857b-089a1705bbe9/egress-router-binary-copy/0.log"
Apr 23 14:03:16.392497 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.392469 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rm6cm_c1494fbd-44b0-417d-857b-089a1705bbe9/cni-plugins/0.log"
Apr 23 14:03:16.438301 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.438278 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rm6cm_c1494fbd-44b0-417d-857b-089a1705bbe9/bond-cni-plugin/0.log"
Apr 23 14:03:16.480385 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.480356 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rm6cm_c1494fbd-44b0-417d-857b-089a1705bbe9/routeoverride-cni/0.log"
Apr 23 14:03:16.524014 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.523990 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rm6cm_c1494fbd-44b0-417d-857b-089a1705bbe9/whereabouts-cni-bincopy/0.log"
Apr 23 14:03:16.556268 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.556246 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rm6cm_c1494fbd-44b0-417d-857b-089a1705bbe9/whereabouts-cni/0.log"
Apr 23 14:03:16.774535 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.774494 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k958q_6866a2aa-1943-4e03-a99a-8b054a2434c8/network-metrics-daemon/0.log"
Apr 23 14:03:16.803326 ip-10-0-128-108 kubenswrapper[2571]: I0423 14:03:16.803297 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k958q_6866a2aa-1943-4e03-a99a-8b054a2434c8/kube-rbac-proxy/0.log"