Apr 16 19:53:52.712352 ip-10-0-128-48 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:53.105844 ip-10-0-128-48 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:53.105844 ip-10-0-128-48 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:53.105844 ip-10-0-128-48 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:53.105844 ip-10-0-128-48 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:53.105844 ip-10-0-128-48 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
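The deprecation warnings above all point at the same remedy: move the flag into the file named by --config. As a hedged sketch only (the field names are taken from the upstream KubeletConfiguration v1beta1 API; the mapping below is illustrative, not a statement of this cluster's configuration), the flagged options correspond to config-file fields roughly like this:

```python
# Illustrative mapping of the deprecated kubelet flags seen in the log above
# to their KubeletConfiguration (v1beta1) config-file equivalents.
# Field names are assumptions based on the upstream API, not from this log.
DEPRECATED_FLAG_TO_CONFIG_FIELD = {
    "--container-runtime-endpoint": "containerRuntimeEndpoint",
    "--volume-plugin-dir": "volumePluginDir",
    "--system-reserved": "systemReserved",
    # Per the warning text itself, this flag has no direct field;
    # the eviction settings replace it.
    "--minimum-container-ttl-duration": "evictionHard / evictionSoft",
}

def config_equivalent(flag: str) -> str:
    """Return the config-file field replacing a deprecated flag, if known."""
    return DEPRECATED_FLAG_TO_CONFIG_FIELD.get(flag, "unknown")

if __name__ == "__main__":
    for flag, field in DEPRECATED_FLAG_TO_CONFIG_FIELD.items():
        print(f"{flag} -> {field}")
```

Note that --pod-infra-container-image is a different case: per its warning, it is simply being removed in 1.35 because the image garbage collector now gets the sandbox image from CRI.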
Apr 16 19:53:53.108082 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.107938 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:53.113377 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113355 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:53.113377 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113373 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:53.113377 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113377 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:53.113377 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113380 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:53.113377 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113384 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113387 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113390 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113393 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113397 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113400 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113403 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113405 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113408 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113411 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113413 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113416 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113419 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113421 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113424 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113427 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113429 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113432 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113434 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113437 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:53.113560 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113444 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113447 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113451 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113453 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113456 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113458 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113461 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113463 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113466 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113468 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113473 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113477 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113480 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113483 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113487 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113490 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113493 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113496 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113498 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:53.114030 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113501 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113504 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113506 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113509 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113512 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113514 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113517 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113520 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113523 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113527 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113531 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113534 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113537 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113540 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113543 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113545 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113548 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113550 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113553 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:53.114522 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113556 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113559 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113562 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113565 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113568 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113571 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113574 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113576 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113579 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113582 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113585 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113588 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113591 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113594 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113596 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113599 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113603 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113606 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113608 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:53.115001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113611 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113613 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113616 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113619 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.113621 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114665 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114672 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114674 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114678 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114681 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114684 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114686 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114689 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114692 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114695 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114697 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114700 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114702 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114705 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114708 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:53.115495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114710 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114713 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114716 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114718 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114721 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114723 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114726 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114731 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114734 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114738 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114740 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114743 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114746 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114748 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114751 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114753 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114756 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114759 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114761 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:53.116009 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114764 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114766 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114769 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114771 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114774 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114777 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114779 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114782 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114784 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114787 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114790 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114792 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114795 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114797 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114801 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114804 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114807 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114809 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114812 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:53.116495 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114815 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114819 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114822 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114825 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114828 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114831 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114833 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114835 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114838 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114840 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114843 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114845 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114848 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114850 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114853 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114855 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114858 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114861 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114863 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114866 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:53.116971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114869 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114871 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114874 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114877 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114881 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114884 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114886 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114889 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114892 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114894 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114897 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114899 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.114902 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.114974 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.114981 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.114987 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.114992 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.114997 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115001 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115005 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115010 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:53.117470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115013 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115017 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115020 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115024 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115027 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115030 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115033 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115035 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115039 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115041 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115044 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115049 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115052 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115055 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115058 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115062 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115066 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115069 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115072 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115075 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115078 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115081 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115084 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115087 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115104 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:53.117984 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115108 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115111 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115114 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115117 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115121 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115124 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115128 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115131 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115134 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115138 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115141 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115144 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115147 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115150 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115154 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115157 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115160 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115163 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115166 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115169 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115172 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115175 2572 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115179 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115182 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115185 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 19:53:53.118620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115188 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115191 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115194 2572 flags.go:64] FLAG: --help="false"
Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115197 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-128-48.ec2.internal"
Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115200 2572 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115204 2572 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115207 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115210 2572 flags.go:64] FLAG:
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115213 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115216 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115219 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115222 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115225 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115228 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115232 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115234 2572 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115237 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115240 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115243 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115246 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115249 2572 flags.go:64] FLAG: --lock-file="" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115252 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" 
Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115255 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115258 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:53:53.119238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115264 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115267 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115269 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115272 2572 flags.go:64] FLAG: --logging-format="text" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115275 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115278 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115281 2572 flags.go:64] FLAG: --manifest-url="" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115284 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115295 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115298 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115302 2572 flags.go:64] FLAG: --max-pods="110" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115305 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115308 2572 flags.go:64] 
FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115311 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115314 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115317 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115320 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115323 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115330 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115333 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115336 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115339 2572 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115343 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:53:53.119813 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115348 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115351 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115354 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 
19:53:53.115357 2572 flags.go:64] FLAG: --port="10250" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115360 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115363 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-002ca34025dcf6f31" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115369 2572 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115384 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115388 2572 flags.go:64] FLAG: --register-node="true" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115391 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115395 2572 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115399 2572 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115402 2572 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115405 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115408 2572 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115411 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115415 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115418 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115421 2572 
flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115423 2572 flags.go:64] FLAG: --runonce="false" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115426 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115429 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115432 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115436 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115439 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115442 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:53:53.120402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115445 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115448 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115451 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115454 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115457 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115460 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115463 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 
19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115466 2572 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115468 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115474 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115477 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115479 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115485 2572 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115488 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115490 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115493 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115497 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115500 2572 flags.go:64] FLAG: --v="2" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115504 2572 flags.go:64] FLAG: --version="false" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115509 2572 flags.go:64] FLAG: --vmodule="" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115513 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.115516 2572 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115618 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115622 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:53.121073 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115625 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115628 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115630 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115634 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115636 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115641 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115644 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115647 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115650 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115652 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 
19:53:53.115656 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115659 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115662 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115665 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115668 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115670 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115673 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115675 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115678 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:53.121675 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115681 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115684 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115687 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115689 2572 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115692 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115695 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115697 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115700 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115703 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115705 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115708 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115710 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115713 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115715 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115718 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115721 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: 
W0416 19:53:53.115723 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115726 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115730 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115732 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:53.122190 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115735 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115737 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115740 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115743 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115745 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115748 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115751 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115753 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115756 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:53.122688 ip-10-0-128-48 
kubenswrapper[2572]: W0416 19:53:53.115759 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115762 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115764 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115767 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115770 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115773 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115776 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115778 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115781 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115783 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115785 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:53.122688 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115788 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115791 2572 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115795 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115798 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115801 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115803 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115806 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115809 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115811 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115814 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115821 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115823 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115826 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115828 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:53.123324 ip-10-0-128-48 
kubenswrapper[2572]: W0416 19:53:53.115831 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115833 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115836 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115838 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115841 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:53.123324 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115844 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:53.123796 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115847 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:53.123796 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115850 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:53.123796 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115852 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:53.123796 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115855 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:53.123796 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.115857 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:53.123796 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.116609 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false 
NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:53.123796 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.123760 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 19:53:53.123796 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.123785 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123840 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123845 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123849 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123852 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123854 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123857 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123860 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123862 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123865 2572 feature_gate.go:328] 
unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123868 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123870 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123873 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123875 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123878 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123881 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123884 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123886 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123889 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123892 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:53.124006 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123894 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123897 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123900 
2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123902 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123905 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123907 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123910 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123913 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123915 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123917 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123920 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123923 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123926 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123928 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123932 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:53.124658 
ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123935 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123937 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123940 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123943 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123945 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:53.124658 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123948 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123951 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123953 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123956 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123958 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123961 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123964 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123966 2572 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123969 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123971 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123974 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123976 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123979 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123982 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123985 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123988 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123990 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123993 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123996 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.123999 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:53.125164 ip-10-0-128-48 kubenswrapper[2572]: W0416 
19:53:53.124001 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124004 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124006 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124009 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124012 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124015 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124018 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124021 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124024 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124026 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124029 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124031 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124034 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 
19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124038 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124042 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124045 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124048 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124051 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124053 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:53.125659 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124056 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124060 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124063 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124065 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124068 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124070 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124073 2572 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124077 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.124082 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124199 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124204 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124207 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124210 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124213 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124216 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:53.126138 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124219 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:53.126503 ip-10-0-128-48 
kubenswrapper[2572]: W0416 19:53:53.124221 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124224 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124226 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124229 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124232 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124235 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124237 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124240 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124242 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124246 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124248 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124251 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124253 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 
19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124256 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124259 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124261 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124263 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124266 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124268 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:53.126503 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124271 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124273 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124276 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124278 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124282 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124286 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124288 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124291 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124294 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124297 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124300 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124303 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124306 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124309 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124311 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124314 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124317 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 
19:53:53.124320 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124323 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:53.126978 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124325 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124328 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124330 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124333 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124335 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124338 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124340 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124343 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124345 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124348 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124350 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:53.127465 
ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124353 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124355 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124358 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124360 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124362 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124365 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124368 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124370 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124372 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:53.127465 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124375 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124377 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124379 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124382 2572 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124385 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124387 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124389 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124392 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124394 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124397 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124400 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124402 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124405 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124407 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124411 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124415 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124417 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124420 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124422 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:53.128031 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124425 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:53.128624 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:53.124427 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:53.128624 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.124433 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:53.128624 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.125109 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 19:53:53.128624 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.127384 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 19:53:53.128624 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.128198 2572 server.go:1019] "Starting client 
certificate rotation" Apr 16 19:53:53.128624 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.128305 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:53:53.128624 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.128355 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:53:53.150204 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.150181 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:53:53.152449 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.152428 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:53:53.166478 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.166458 2572 log.go:25] "Validated CRI v1 runtime API" Apr 16 19:53:53.171603 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.171586 2572 log.go:25] "Validated CRI v1 image API" Apr 16 19:53:53.172812 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.172798 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 19:53:53.176937 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.176916 2572 fs.go:135] Filesystem UUIDs: map[0f9db906-f544-436f-a4d1-24477f822ac7:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8ea3b3b2-ceae-4ea7-9c99-b52ed3bacc15:/dev/nvme0n1p4] Apr 16 19:53:53.177019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.176935 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} 
composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 19:53:53.181263 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.181243 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:53:53.182977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.182876 2572 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:53.181054803 +0000 UTC m=+0.359816823 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100403 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26bfda573eb9d55660325704b18d59 SystemUUID:ec26bfda-573e-b9d5-5660-325704b18d59 BootID:d6c20084-1247-4e66-adf9-87ca7ea044c7 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4c:03:00:e2:2d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4c:03:00:e2:2d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:71:00:7b:de:1e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} 
{PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 19:53:53.182977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.182970 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 19:53:53.183119 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.183044 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 19:53:53.183998 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.183973 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 19:53:53.184151 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.184000 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-48.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:53:53.184201 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.184160 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:53:53.184201 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.184168 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:53:53.184201 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.184185 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:53.184901 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.184890 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:53:53.186358 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.186347 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:53.186469 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.186460 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:53:53.188831 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.188823 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:53:53.188865 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.188835 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:53:53.188865 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.188846 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:53:53.188865 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.188854 2572 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:53:53.188865 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.188863 2572 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 19:53:53.189897 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.189884 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:53.189938 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.189903 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:53:53.192417 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.192401 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:53:53.196567 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.196542 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:53:53.197865 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197848 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:53:53.197865 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197867 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197874 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197881 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197889 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197895 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197901 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197906 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197914 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197920 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197937 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:53:53.197956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.197948 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:53:53.198824 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.198812 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:53:53.198824 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.198823 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:53:53.200242 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.200220 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:53:53.200332 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.200240 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-48.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:53:53.202733 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.202721 2572 watchdog_linux.go:99] 
"Systemd watchdog is not enabled" Apr 16 19:53:53.202802 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.202756 2572 server.go:1295] "Started kubelet" Apr 16 19:53:53.202864 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.202829 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:53:53.202916 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.202844 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:53:53.202916 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.202900 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:53:53.203522 ip-10-0-128-48 systemd[1]: Started Kubernetes Kubelet. Apr 16 19:53:53.204535 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.204506 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:53:53.205543 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.205524 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:53:53.209403 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.209386 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:53.209947 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.209928 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:53:53.210756 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.210725 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:53.210756 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.210747 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:53:53.210883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.210790 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:53:53.210883 ip-10-0-128-48 kubenswrapper[2572]: I0416 
19:53:53.210804 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:53:53.210960 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.210921 2572 factory.go:153] Registering CRI-O factory Apr 16 19:53:53.210960 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.210960 2572 factory.go:223] Registration of the crio container factory successfully Apr 16 19:53:53.211045 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.210975 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:53:53.211045 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.210984 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:53:53.211045 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.211031 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:53:53.211045 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.211042 2572 factory.go:55] Registering systemd factory Apr 16 19:53:53.211247 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.211050 2572 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:53:53.211247 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.211069 2572 factory.go:103] Registering Raw factory Apr 16 19:53:53.211247 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.211080 2572 manager.go:1196] Started watching for new ooms in manager Apr 16 19:53:53.211247 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.211184 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-48.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:53:53.211721 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.211701 2572 manager.go:319] Starting recovery of all 
containers Apr 16 19:53:53.212730 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.211291 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-48.ec2.internal.18a6ee67eefb5fc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-48.ec2.internal,UID:ip-10-0-128-48.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-48.ec2.internal,},FirstTimestamp:2026-04-16 19:53:53.202733001 +0000 UTC m=+0.381495016,LastTimestamp:2026-04-16 19:53:53.202733001 +0000 UTC m=+0.381495016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-48.ec2.internal,}" Apr 16 19:53:53.212730 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.212562 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:53:53.215691 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.215666 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 19:53:53.216051 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.216025 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-48.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:53:53.221933 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.221914 2572 manager.go:324] Recovery completed Apr 16 19:53:53.224110 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.223988 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 19:53:53.226870 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.226858 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:53.229062 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.229038 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:53.229171 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.229072 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:53.229171 ip-10-0-128-48 
kubenswrapper[2572]: I0416 19:53:53.229083 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:53.229565 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.229550 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:53:53.229565 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.229564 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:53:53.229648 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.229579 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:53:53.230499 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.230441 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-48.ec2.internal.18a6ee67f08d1876 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-48.ec2.internal,UID:ip-10-0-128-48.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-48.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-48.ec2.internal,},FirstTimestamp:2026-04-16 19:53:53.229060214 +0000 UTC m=+0.407822230,LastTimestamp:2026-04-16 19:53:53.229060214 +0000 UTC m=+0.407822230,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-48.ec2.internal,}" Apr 16 19:53:53.231739 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.231728 2572 policy_none.go:49] "None policy: Start" Apr 16 19:53:53.231784 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.231743 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:53:53.231784 ip-10-0-128-48 kubenswrapper[2572]: I0416 
19:53:53.231753 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:53:53.232011 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.231996 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rgct8" Apr 16 19:53:53.237814 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.237796 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rgct8" Apr 16 19:53:53.266793 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.266771 2572 manager.go:341] "Starting Device Plugin manager" Apr 16 19:53:53.266870 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.266803 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:53:53.266870 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.266815 2572 server.go:85] "Starting device plugin registration server" Apr 16 19:53:53.267053 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.267034 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:53:53.267146 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.267049 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:53:53.267215 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.267146 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:53:53.267277 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.267266 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:53:53.267315 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.267278 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:53:53.267663 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.267645 2572 eviction_manager.go:267] "eviction manager: failed to check if we have 
separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:53:53.267727 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.267676 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:53.359627 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.359557 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:53:53.360742 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.360729 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:53:53.360801 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.360753 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:53:53.360801 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.360774 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 19:53:53.360801 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.360780 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:53:53.360945 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.360818 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:53:53.363163 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.363144 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:53.367141 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.367123 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:53.367891 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.367876 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:53.367968 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.367910 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:53.367968 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.367926 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:53.367968 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.367956 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.378650 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.378633 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.378706 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.378651 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-48.ec2.internal\": node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:53.402745 
ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.402723 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:53.461782 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.461764 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal"] Apr 16 19:53:53.461843 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.461834 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:53.463214 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.463201 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:53.463275 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.463225 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:53.463275 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.463235 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:53.464409 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.464397 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:53.464563 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.464551 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.464600 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.464576 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:53.465001 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.464986 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:53.465001 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.464994 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:53.465138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.465012 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:53.465138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.465015 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:53.465138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.465025 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:53.465138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.465031 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:53.466227 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.466210 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.466311 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.466235 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:53:53.466797 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.466784 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:53:53.466861 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.466810 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:53:53.466861 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.466823 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:53:53.481614 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.481591 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-48.ec2.internal\" not found" node="ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.485793 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.485778 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-48.ec2.internal\" not found" node="ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.503272 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.503255 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:53.512121 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.512085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a59663744879090502bab85c5c499c1b-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal\" (UID: \"a59663744879090502bab85c5c499c1b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.512167 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.512128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a59663744879090502bab85c5c499c1b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal\" (UID: \"a59663744879090502bab85c5c499c1b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.512167 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.512146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ef9ecbb9850ae7e0cccd45a7695b7be-config\") pod \"kube-apiserver-proxy-ip-10-0-128-48.ec2.internal\" (UID: \"0ef9ecbb9850ae7e0cccd45a7695b7be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.604308 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.604275 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:53.612684 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.612636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a59663744879090502bab85c5c499c1b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal\" (UID: \"a59663744879090502bab85c5c499c1b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.612684 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.612663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/0ef9ecbb9850ae7e0cccd45a7695b7be-config\") pod \"kube-apiserver-proxy-ip-10-0-128-48.ec2.internal\" (UID: \"0ef9ecbb9850ae7e0cccd45a7695b7be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.612776 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.612690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a59663744879090502bab85c5c499c1b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal\" (UID: \"a59663744879090502bab85c5c499c1b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.612776 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.612735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ef9ecbb9850ae7e0cccd45a7695b7be-config\") pod \"kube-apiserver-proxy-ip-10-0-128-48.ec2.internal\" (UID: \"0ef9ecbb9850ae7e0cccd45a7695b7be\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.612776 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.612755 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a59663744879090502bab85c5c499c1b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal\" (UID: \"a59663744879090502bab85c5c499c1b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.612776 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.612736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a59663744879090502bab85c5c499c1b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal\" (UID: \"a59663744879090502bab85c5c499c1b\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.704988 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.704952 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:53.783459 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.783436 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.787992 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:53.787976 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" Apr 16 19:53:53.805783 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.805766 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:53.906412 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:53.906323 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:54.006896 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.006867 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-48.ec2.internal\" not found" Apr 16 19:53:54.028810 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.028784 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:54.110166 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.110144 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" Apr 16 19:53:54.122670 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.122648 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, 
which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:53:54.124376 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.124357 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" Apr 16 19:53:54.128917 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.128901 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 19:53:54.129030 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.129013 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:53:54.129105 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.129074 2572 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://abb9c8bb95cf54d749dff3ad7b93926c-55fde18ec7c9d109.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.128.48:35398->34.236.127.74:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" Apr 16 19:53:54.189497 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.189470 2572 apiserver.go:52] "Watching apiserver" Apr 16 19:53:54.191886 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.191862 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef9ecbb9850ae7e0cccd45a7695b7be.slice/crio-6d50ab3929e9c0ce1605aec217682a1643380aed4a25951d41ba30baed3fbf28 WatchSource:0}: Error finding container 6d50ab3929e9c0ce1605aec217682a1643380aed4a25951d41ba30baed3fbf28: Status 404 returned error can't find the container with id 6d50ab3929e9c0ce1605aec217682a1643380aed4a25951d41ba30baed3fbf28 Apr 16 19:53:54.192313 
ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.192294 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda59663744879090502bab85c5c499c1b.slice/crio-fa83fd5379315de8b636531d88f7903c001d022e2df6860722b5d8e75ee09091 WatchSource:0}: Error finding container fa83fd5379315de8b636531d88f7903c001d022e2df6860722b5d8e75ee09091: Status 404 returned error can't find the container with id fa83fd5379315de8b636531d88f7903c001d022e2df6860722b5d8e75ee09091 Apr 16 19:53:54.196289 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.196274 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:53:54.198773 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.198757 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:53:54.201074 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.201042 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal","openshift-network-diagnostics/network-check-target-qvsgg","openshift-ovn-kubernetes/ovnkube-node-bsbmz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl","openshift-multus/multus-additional-cni-plugins-j48bx","openshift-multus/multus-mjhhc","openshift-multus/network-metrics-daemon-5j478","openshift-network-operator/iptables-alerter-mlhp9","kube-system/konnectivity-agent-dxbfj","kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal","openshift-cluster-node-tuning-operator/tuned-d2lfx","openshift-image-registry/node-ca-2jzjc"] Apr 16 19:53:54.203957 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.203934 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:53:54.204035 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.204018 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d" Apr 16 19:53:54.205338 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.205315 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.206297 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.206279 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.207516 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.207497 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.208836 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.208810 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.209538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209340 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s7ssk\"" Apr 16 19:53:54.209538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209369 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:53:54.209538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209400 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:53:54.209538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209411 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 19:53:54.209538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209473 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:53:54.209538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209484 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:53:54.209538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209403 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:53:54.209933 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209407 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:53:54.209933 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209897 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rklwx\"" Apr 16 19:53:54.209933 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.209913 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:53:54.210779 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210224 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:53:54.210779 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210279 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:53:54.210779 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210355 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:53:54.210779 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.210416 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973" Apr 16 19:53:54.210779 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210524 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 19:53:54.210779 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210728 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 19:53:54.210779 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210760 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ts624\"" Apr 16 19:53:54.211150 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210902 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 19:53:54.211150 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210970 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 19:53:54.211150 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.210981 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 19:53:54.211716 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.211699 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 19:53:54.211716 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.211710 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qnz9r\"" Apr 16 19:53:54.211980 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.211964 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mlhp9" Apr 16 19:53:54.213690 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.213284 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:53:54.214470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.214453 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.214580 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.214564 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:53:54.214650 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.214581 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:53:54.214824 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.214798 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:53:54.214890 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.214854 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nqwng\"" Apr 16 19:53:54.215660 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.215646 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.216005 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.215990 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:53:54.216108 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.215994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74091848-cf67-4c6a-9ad9-adc12a5e47ad-cni-binary-copy\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.216108 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:53:54.216108 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-cni-netd\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216270 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216113 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-slash\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216270 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216124 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jm84x\"" Apr 16 19:53:54.216270 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216138 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-ovnkube-config\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216270 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-system-cni-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.216270 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-etc-openvswitch\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216510 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-system-cni-dir\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.216510 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216310 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-kubelet\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216510 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216363 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-os-release\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.216510 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-daemon-config\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.216510 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-log-socket\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216510 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-env-overrides\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-724dh\" (UniqueName: \"kubernetes.io/projected/40cddc46-c9be-411b-92b8-a65e04009fc9-kube-api-access-724dh\") pod 
\"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216555 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-ovnkube-script-lib\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216631 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-iptables-alerter-script\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216655 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-socket-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216680 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-openvswitch\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4682\" (UniqueName: \"kubernetes.io/projected/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-kube-api-access-h4682\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216730 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-cnibin\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.216753 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-k8s-cni-cncf-io\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216788 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-cni-multus\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " 
pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-hostroot\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-multus-certs\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7dq\" (UniqueName: \"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: \"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg7hq\" (UniqueName: \"kubernetes.io/projected/458f83e2-e97a-457a-9081-a5ae099b6973-kube-api-access-dg7hq\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216973 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-registration-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-kubelet\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.216996 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-k2m7p\"" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217055 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-device-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217114 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:53:54.217138 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-etc-selinux\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217157 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-sys-fs\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217246 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-conf-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-etc-kubernetes\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217313 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217330 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2ktz\" (UniqueName: \"kubernetes.io/projected/ffafc088-a777-4b43-b168-c2fdb6bdbbab-kube-api-access-d2ktz\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217402 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7jwt9\" (UniqueName: \"kubernetes.io/projected/74091848-cf67-4c6a-9ad9-adc12a5e47ad-kube-api-access-7jwt9\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217425 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-systemd\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217452 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-node-log\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-var-lib-openvswitch\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217522 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-cni-bin\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217620 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-cni-binary-copy\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217643 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-cni-bin\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217695 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-systemd-units\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.217719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40cddc46-c9be-411b-92b8-a65e04009fc9-ovn-node-metrics-cert\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-host-slash\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217772 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6fbqq\""
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217780 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-cni-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217796 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-run-netns\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.217811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f74t\" (UniqueName: \"kubernetes.io/projected/2cd1935c-d57b-4e12-881b-0c81444e85ac-kube-api-access-6f74t\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.218124 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-os-release\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.218174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-socket-dir-parent\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.218201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-netns\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.218232 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-cnibin\") pod 
\"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.218262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-ovn\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.218523 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.218392 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:53:54.219087 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.218802 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:53:54.219087 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.218958 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:53:54.224128 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.224110 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:53:54.239427 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.239391 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:53 +0000 UTC" deadline="2028-01-31 01:09:24.743626588 +0000 UTC"
Apr 16 19:53:54.239486 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.239429 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15701h15m30.504201634s"
Apr 16 19:53:54.247353 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.247334 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-j7hph"
Apr 16 19:53:54.255654 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.255640 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-j7hph"
Apr 16 19:53:54.292249 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.292232 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:54.311401 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.311388 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 19:53:54.318433 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-conf-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.318487 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-etc-kubernetes\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.318522 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-conf-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.318553 ip-10-0-128-48 
kubenswrapper[2572]: I0416 19:53:54.318527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-etc-kubernetes\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.318553 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysconfig\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.318621 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-run\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.318621 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl"
Apr 16 19:53:54.318621 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2ktz\" (UniqueName: \"kubernetes.io/projected/ffafc088-a777-4b43-b168-c2fdb6bdbbab-kube-api-access-d2ktz\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl"
Apr 16 19:53:54.318621 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318612 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jwt9\" (UniqueName: \"kubernetes.io/projected/74091848-cf67-4c6a-9ad9-adc12a5e47ad-kube-api-access-7jwt9\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.318791 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-systemd\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.318791 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-node-log\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.318791 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl"
Apr 16 19:53:54.318791 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-var-lib-openvswitch\") pod \"ovnkube-node-bsbmz\" (UID: 
\"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.318791 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-cni-bin\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.318791 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-node-log\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.318791 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318751 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-cni-bin\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.318791 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318754 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-var-lib-openvswitch\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-cni-binary-copy\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318826 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-systemd\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-cni-bin\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-systemd-units\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318929 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-cni-bin\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40cddc46-c9be-411b-92b8-a65e04009fc9-ovn-node-metrics-cert\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.318977 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-modprobe-d\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.319000 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-run-ovn-kubernetes\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319067 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-host-slash\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.319079 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs podName:458f83e2-e97a-457a-9081-a5ae099b6973 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:54.819047282 +0000 UTC m=+1.997809329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs") pod "network-metrics-daemon-5j478" (UID: "458f83e2-e97a-457a-9081-a5ae099b6973") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-host-slash\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9"
Apr 16 19:53:54.319155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-cni-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.319155 
ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-run-netns\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f74t\" (UniqueName: \"kubernetes.io/projected/2cd1935c-d57b-4e12-881b-0c81444e85ac-kube-api-access-6f74t\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shnwv\" (UniqueName: \"kubernetes.io/projected/d41807f7-41a1-4bc5-bb49-bc957656ab37-kube-api-access-shnwv\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319238 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w58f\" (UniqueName: \"kubernetes.io/projected/b9261443-4578-43ee-abc0-7931d8ab9f10-kube-api-access-7w58f\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-os-release\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-socket-dir-parent\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319295 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-run-netns\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-netns\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-cnibin\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319368 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319369 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysctl-conf\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319415 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-lib-modules\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319407 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-os-release\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319426 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-host\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-systemd-units\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319455 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-socket-dir-parent\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.319935 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319433 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-cni-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-ovn\") pod \"ovnkube-node-bsbmz\" (UID: 
\"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-systemd\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319482 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-cnibin\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d41807f7-41a1-4bc5-bb49-bc957656ab37-tmp\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319512 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-netns\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fd607fa-9f37-4836-a69c-147c1052dbc4-konnectivity-ca\") pod \"konnectivity-agent-dxbfj\" (UID: \"9fd607fa-9f37-4836-a69c-147c1052dbc4\") " pod="kube-system/konnectivity-agent-dxbfj"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74091848-cf67-4c6a-9ad9-adc12a5e47ad-cni-binary-copy\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-ovn\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-cni-netd\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-var-lib-kubelet\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-slash\") pod \"ovnkube-node-bsbmz\" (UID: 
\"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-ovnkube-config\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319732 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-cni-netd\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9261443-4578-43ee-abc0-7931d8ab9f10-host\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-slash\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-system-cni-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-etc-openvswitch\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz"
Apr 16 19:53:54.320769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319879 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-system-cni-dir\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc"
Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319902 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-system-cni-dir\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-system-cni-dir\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx"
Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-etc-openvswitch\") pod 
\"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.319964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-kubelet\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-os-release\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320062 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-daemon-config\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320080 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-log-socket\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-os-release\") pod \"multus-mjhhc\" (UID: 
\"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-env-overrides\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-log-socket\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320193 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-724dh\" (UniqueName: \"kubernetes.io/projected/40cddc46-c9be-411b-92b8-a65e04009fc9-kube-api-access-724dh\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320230 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74091848-cf67-4c6a-9ad9-adc12a5e47ad-cni-binary-copy\") pod 
\"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-kubelet\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-tuned\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-ovnkube-script-lib\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.321597 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-iptables-alerter-script\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-socket-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320396 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-openvswitch\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b9261443-4578-43ee-abc0-7931d8ab9f10-serviceca\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4682\" (UniqueName: \"kubernetes.io/projected/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-kube-api-access-h4682\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-cnibin\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-k8s-cni-cncf-io\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74091848-cf67-4c6a-9ad9-adc12a5e47ad-multus-daemon-config\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-cni-multus\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321155 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-cni-multus\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-hostroot\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-env-overrides\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-multus-certs\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-hostroot\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321503 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-multus-certs\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-run-k8s-cni-cncf-io\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321605 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-cnibin\") 
pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-socket-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.322615 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.320349 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-ovnkube-config\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40cddc46-c9be-411b-92b8-a65e04009fc9-ovnkube-script-lib\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-run-openvswitch\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.321976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7dq\" (UniqueName: 
\"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: \"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg7hq\" (UniqueName: \"kubernetes.io/projected/458f83e2-e97a-457a-9081-a5ae099b6973-kube-api-access-dg7hq\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-registration-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-kubelet\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 
19:53:54.322166 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-kubernetes\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322286 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fd607fa-9f37-4836-a69c-147c1052dbc4-agent-certs\") pod \"konnectivity-agent-dxbfj\" (UID: \"9fd607fa-9f37-4836-a69c-147c1052dbc4\") " pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-device-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: 
\"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-etc-selinux\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-sys-fs\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322430 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cd1935c-d57b-4e12-881b-0c81444e85ac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.323467 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-registration-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-iptables-alerter-script\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-device-dir\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322555 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74091848-cf67-4c6a-9ad9-adc12a5e47ad-host-var-lib-kubelet\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-etc-selinux\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322697 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322735 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysctl-d\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-sys\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322803 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ffafc088-a777-4b43-b168-c2fdb6bdbbab-sys-fs\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.322986 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40cddc46-c9be-411b-92b8-a65e04009fc9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.323007 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2cd1935c-d57b-4e12-881b-0c81444e85ac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " 
pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.324190 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.323316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40cddc46-c9be-411b-92b8-a65e04009fc9-ovn-node-metrics-cert\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.332303 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.332281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jwt9\" (UniqueName: \"kubernetes.io/projected/74091848-cf67-4c6a-9ad9-adc12a5e47ad-kube-api-access-7jwt9\") pod \"multus-mjhhc\" (UID: \"74091848-cf67-4c6a-9ad9-adc12a5e47ad\") " pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.332814 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.332797 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2ktz\" (UniqueName: \"kubernetes.io/projected/ffafc088-a777-4b43-b168-c2fdb6bdbbab-kube-api-access-d2ktz\") pod \"aws-ebs-csi-driver-node-lnmvl\" (UID: \"ffafc088-a777-4b43-b168-c2fdb6bdbbab\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.340854 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.340837 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:54.340925 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.340859 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:54.340925 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.340872 2572 projected.go:194] Error preparing data for projected volume kube-api-access-lk7dq for pod 
openshift-network-diagnostics/network-check-target-qvsgg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:54.340925 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.340922 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq podName:dc2e7638-ebb5-4713-a221-8c885ed0b19d nodeName:}" failed. No retries permitted until 2026-04-16 19:53:54.840907684 +0000 UTC m=+2.019669705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lk7dq" (UniqueName: "kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq") pod "network-check-target-qvsgg" (UID: "dc2e7638-ebb5-4713-a221-8c885ed0b19d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:54.341805 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.341779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg7hq\" (UniqueName: \"kubernetes.io/projected/458f83e2-e97a-457a-9081-a5ae099b6973-kube-api-access-dg7hq\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:53:54.342799 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.342770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4682\" (UniqueName: \"kubernetes.io/projected/4ee2a187-ea67-4cf0-a11b-3bf8404d67b6-kube-api-access-h4682\") pod \"iptables-alerter-mlhp9\" (UID: \"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6\") " pod="openshift-network-operator/iptables-alerter-mlhp9" Apr 16 19:53:54.343008 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.342991 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6f74t\" (UniqueName: \"kubernetes.io/projected/2cd1935c-d57b-4e12-881b-0c81444e85ac-kube-api-access-6f74t\") pod \"multus-additional-cni-plugins-j48bx\" (UID: \"2cd1935c-d57b-4e12-881b-0c81444e85ac\") " pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.343065 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.343003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-724dh\" (UniqueName: \"kubernetes.io/projected/40cddc46-c9be-411b-92b8-a65e04009fc9-kube-api-access-724dh\") pod \"ovnkube-node-bsbmz\" (UID: \"40cddc46-c9be-411b-92b8-a65e04009fc9\") " pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.364776 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.364737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" event={"ID":"0ef9ecbb9850ae7e0cccd45a7695b7be","Type":"ContainerStarted","Data":"6d50ab3929e9c0ce1605aec217682a1643380aed4a25951d41ba30baed3fbf28"} Apr 16 19:53:54.365626 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.365609 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" event={"ID":"a59663744879090502bab85c5c499c1b","Type":"ContainerStarted","Data":"fa83fd5379315de8b636531d88f7903c001d022e2df6860722b5d8e75ee09091"} Apr 16 19:53:54.423541 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysctl-conf\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423638 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-lib-modules\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423638 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-host\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423638 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d41807f7-41a1-4bc5-bb49-bc957656ab37-tmp\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423638 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423615 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9fd607fa-9f37-4836-a69c-147c1052dbc4-konnectivity-ca\") pod \"konnectivity-agent-dxbfj\" (UID: \"9fd607fa-9f37-4836-a69c-147c1052dbc4\") " pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-host\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-var-lib-kubelet\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423665 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-lib-modules\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysctl-conf\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-var-lib-kubelet\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423693 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b9261443-4578-43ee-abc0-7931d8ab9f10-host\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b9261443-4578-43ee-abc0-7931d8ab9f10-host\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-tuned\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423770 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b9261443-4578-43ee-abc0-7931d8ab9f10-serviceca\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-kubernetes\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.423822 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fd607fa-9f37-4836-a69c-147c1052dbc4-agent-certs\") pod \"konnectivity-agent-dxbfj\" (UID: \"9fd607fa-9f37-4836-a69c-147c1052dbc4\") " pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysctl-d\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-sys\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-kubernetes\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysconfig\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423935 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-run\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysconfig\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-systemd\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-sysctl-d\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.423996 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-sys\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.424006 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-run\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.424047 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-systemd\") 
pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.424130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-modprobe-d\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.424161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shnwv\" (UniqueName: \"kubernetes.io/projected/d41807f7-41a1-4bc5-bb49-bc957656ab37-kube-api-access-shnwv\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.424185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w58f\" (UniqueName: \"kubernetes.io/projected/b9261443-4578-43ee-abc0-7931d8ab9f10-kube-api-access-7w58f\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.424230 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-modprobe-d\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.424241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/9fd607fa-9f37-4836-a69c-147c1052dbc4-konnectivity-ca\") pod \"konnectivity-agent-dxbfj\" (UID: \"9fd607fa-9f37-4836-a69c-147c1052dbc4\") " pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:53:54.424404 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.424264 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b9261443-4578-43ee-abc0-7931d8ab9f10-serviceca\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.425571 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.425551 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d41807f7-41a1-4bc5-bb49-bc957656ab37-tmp\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.425836 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.425821 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9mwx9"] Apr 16 19:53:54.425921 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.425823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d41807f7-41a1-4bc5-bb49-bc957656ab37-etc-tuned\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.426030 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.426015 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9fd607fa-9f37-4836-a69c-147c1052dbc4-agent-certs\") pod \"konnectivity-agent-dxbfj\" (UID: \"9fd607fa-9f37-4836-a69c-147c1052dbc4\") " pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:53:54.428313 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.428299 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.428369 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.428350 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68" Apr 16 19:53:54.436152 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.436107 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shnwv\" (UniqueName: \"kubernetes.io/projected/d41807f7-41a1-4bc5-bb49-bc957656ab37-kube-api-access-shnwv\") pod \"tuned-d2lfx\" (UID: \"d41807f7-41a1-4bc5-bb49-bc957656ab37\") " pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.436414 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.436400 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w58f\" (UniqueName: \"kubernetes.io/projected/b9261443-4578-43ee-abc0-7931d8ab9f10-kube-api-access-7w58f\") pod \"node-ca-2jzjc\" (UID: \"b9261443-4578-43ee-abc0-7931d8ab9f10\") " pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.517554 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.517531 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:53:54.523909 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.523890 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mjhhc" Apr 16 19:53:54.524593 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.524571 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40cddc46_c9be_411b_92b8_a65e04009fc9.slice/crio-3c35ac8a615fa383f71bc1e441083a07b2c69b6b77c30ce68d45bdf6a740d65a WatchSource:0}: Error finding container 3c35ac8a615fa383f71bc1e441083a07b2c69b6b77c30ce68d45bdf6a740d65a: Status 404 returned error can't find the container with id 3c35ac8a615fa383f71bc1e441083a07b2c69b6b77c30ce68d45bdf6a740d65a Apr 16 19:53:54.524675 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.524612 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fec5c910-3cf0-49a0-b436-62a7236c7d68-kubelet-config\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.524675 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.524664 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fec5c910-3cf0-49a0-b436-62a7236c7d68-dbus\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.524806 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.524713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.530380 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.530363 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" Apr 16 19:53:54.530707 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.530686 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74091848_cf67_4c6a_9ad9_adc12a5e47ad.slice/crio-21185986a6403cfa66df62d73e46942706012819f27b0bbd09704946ee3e218e WatchSource:0}: Error finding container 21185986a6403cfa66df62d73e46942706012819f27b0bbd09704946ee3e218e: Status 404 returned error can't find the container with id 21185986a6403cfa66df62d73e46942706012819f27b0bbd09704946ee3e218e Apr 16 19:53:54.534829 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.534588 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j48bx" Apr 16 19:53:54.539788 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.539763 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffafc088_a777_4b43_b168_c2fdb6bdbbab.slice/crio-64f35271839b5ef780932ae45bcf5a7f783e3a63cf37bdad2fe855c1e04fc7cb WatchSource:0}: Error finding container 64f35271839b5ef780932ae45bcf5a7f783e3a63cf37bdad2fe855c1e04fc7cb: Status 404 returned error can't find the container with id 64f35271839b5ef780932ae45bcf5a7f783e3a63cf37bdad2fe855c1e04fc7cb Apr 16 19:53:54.540013 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.539982 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mlhp9" Apr 16 19:53:54.544971 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.544948 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd1935c_d57b_4e12_881b_0c81444e85ac.slice/crio-63c180bc638785c0bb929cd9ac7946fe423822d7117ed377bf4ef8e6762afd16 WatchSource:0}: Error finding container 63c180bc638785c0bb929cd9ac7946fe423822d7117ed377bf4ef8e6762afd16: Status 404 returned error can't find the container with id 63c180bc638785c0bb929cd9ac7946fe423822d7117ed377bf4ef8e6762afd16 Apr 16 19:53:54.546057 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.546037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:53:54.548313 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.548286 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee2a187_ea67_4cf0_a11b_3bf8404d67b6.slice/crio-70a208bc4eb0916aa758485a59c8c91286e8ef3d1f75b6356a9514cd9ad8e5d7 WatchSource:0}: Error finding container 70a208bc4eb0916aa758485a59c8c91286e8ef3d1f75b6356a9514cd9ad8e5d7: Status 404 returned error can't find the container with id 70a208bc4eb0916aa758485a59c8c91286e8ef3d1f75b6356a9514cd9ad8e5d7 Apr 16 19:53:54.550570 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.550468 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" Apr 16 19:53:54.553815 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.553783 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fd607fa_9f37_4836_a69c_147c1052dbc4.slice/crio-def601a3ea6233b8cdf125d0e7ce76f90755cdcc9f7c9c160d5faf4bab61359c WatchSource:0}: Error finding container def601a3ea6233b8cdf125d0e7ce76f90755cdcc9f7c9c160d5faf4bab61359c: Status 404 returned error can't find the container with id def601a3ea6233b8cdf125d0e7ce76f90755cdcc9f7c9c160d5faf4bab61359c Apr 16 19:53:54.554678 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.554655 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2jzjc" Apr 16 19:53:54.558612 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.558529 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41807f7_41a1_4bc5_bb49_bc957656ab37.slice/crio-4bc85940d9f27d8d7d0f6782be97daeb40ce1a6bdb462efc164b34c7bdf54ede WatchSource:0}: Error finding container 4bc85940d9f27d8d7d0f6782be97daeb40ce1a6bdb462efc164b34c7bdf54ede: Status 404 returned error can't find the container with id 4bc85940d9f27d8d7d0f6782be97daeb40ce1a6bdb462efc164b34c7bdf54ede Apr 16 19:53:54.563270 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:53:54.563241 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9261443_4578_43ee_abc0_7931d8ab9f10.slice/crio-2889056a9e2c1f88391957c2a224ad0cfd96d76827022650a0cad3eb3b7c9337 WatchSource:0}: Error finding container 2889056a9e2c1f88391957c2a224ad0cfd96d76827022650a0cad3eb3b7c9337: Status 404 returned error can't find the container with id 2889056a9e2c1f88391957c2a224ad0cfd96d76827022650a0cad3eb3b7c9337 Apr 16 19:53:54.573737 ip-10-0-128-48 
kubenswrapper[2572]: I0416 19:53:54.573715 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:53:54.625794 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.625771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fec5c910-3cf0-49a0-b436-62a7236c7d68-dbus\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.625896 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.625829 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.625955 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.625893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fec5c910-3cf0-49a0-b436-62a7236c7d68-kubelet-config\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.626045 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.626030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fec5c910-3cf0-49a0-b436-62a7236c7d68-kubelet-config\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.626180 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.626164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/fec5c910-3cf0-49a0-b436-62a7236c7d68-dbus\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:54.626255 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.626209 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:54.626332 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.626275 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret podName:fec5c910-3cf0-49a0-b436-62a7236c7d68 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:55.126256339 +0000 UTC m=+2.305018345 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret") pod "global-pull-secret-syncer-9mwx9" (UID: "fec5c910-3cf0-49a0-b436-62a7236c7d68") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:54.828301 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.828224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:53:54.828457 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.828417 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:54.828575 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.828497 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs 
podName:458f83e2-e97a-457a-9081-a5ae099b6973 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:55.828475954 +0000 UTC m=+3.007237962 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs") pod "network-metrics-daemon-5j478" (UID: "458f83e2-e97a-457a-9081-a5ae099b6973") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:53:54.930592 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:54.928877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7dq\" (UniqueName: \"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: \"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:53:54.930592 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.929126 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:53:54.930592 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.929148 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:53:54.930592 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.929162 2572 projected.go:194] Error preparing data for projected volume kube-api-access-lk7dq for pod openshift-network-diagnostics/network-check-target-qvsgg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:54.930592 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:54.929220 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq podName:dc2e7638-ebb5-4713-a221-8c885ed0b19d nodeName:}" failed. No retries permitted until 2026-04-16 19:53:55.929202136 +0000 UTC m=+3.107964155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-lk7dq" (UniqueName: "kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq") pod "network-check-target-qvsgg" (UID: "dc2e7638-ebb5-4713-a221-8c885ed0b19d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:53:55.131077 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.130908 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:53:55.131904 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:55.131078 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:55.131904 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:55.131161 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret podName:fec5c910-3cf0-49a0-b436-62a7236c7d68 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:56.13114308 +0000 UTC m=+3.309905083 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret") pod "global-pull-secret-syncer-9mwx9" (UID: "fec5c910-3cf0-49a0-b436-62a7236c7d68") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:53:55.257071 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.256975 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:54 +0000 UTC" deadline="2028-01-11 11:30:27.91466027 +0000 UTC" Apr 16 19:53:55.257071 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.257019 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15231h36m32.657645356s" Apr 16 19:53:55.390051 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.389966 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dxbfj" event={"ID":"9fd607fa-9f37-4836-a69c-147c1052dbc4","Type":"ContainerStarted","Data":"def601a3ea6233b8cdf125d0e7ce76f90755cdcc9f7c9c160d5faf4bab61359c"} Apr 16 19:53:55.400821 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.400788 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mlhp9" event={"ID":"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6","Type":"ContainerStarted","Data":"70a208bc4eb0916aa758485a59c8c91286e8ef3d1f75b6356a9514cd9ad8e5d7"} Apr 16 19:53:55.405977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.405926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerStarted","Data":"63c180bc638785c0bb929cd9ac7946fe423822d7117ed377bf4ef8e6762afd16"} Apr 16 19:53:55.407737 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.407695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-mjhhc" event={"ID":"74091848-cf67-4c6a-9ad9-adc12a5e47ad","Type":"ContainerStarted","Data":"21185986a6403cfa66df62d73e46942706012819f27b0bbd09704946ee3e218e"}
Apr 16 19:53:55.414959 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.414913 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"3c35ac8a615fa383f71bc1e441083a07b2c69b6b77c30ce68d45bdf6a740d65a"}
Apr 16 19:53:55.417635 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.417581 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2jzjc" event={"ID":"b9261443-4578-43ee-abc0-7931d8ab9f10","Type":"ContainerStarted","Data":"2889056a9e2c1f88391957c2a224ad0cfd96d76827022650a0cad3eb3b7c9337"}
Apr 16 19:53:55.421425 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.421363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" event={"ID":"d41807f7-41a1-4bc5-bb49-bc957656ab37","Type":"ContainerStarted","Data":"4bc85940d9f27d8d7d0f6782be97daeb40ce1a6bdb462efc164b34c7bdf54ede"}
Apr 16 19:53:55.424810 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.424772 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" event={"ID":"ffafc088-a777-4b43-b168-c2fdb6bdbbab","Type":"ContainerStarted","Data":"64f35271839b5ef780932ae45bcf5a7f783e3a63cf37bdad2fe855c1e04fc7cb"}
Apr 16 19:53:55.650517 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.650433 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:55.836662 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.836628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:53:55.836855 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:55.836799 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:55.836918 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:55.836873 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs podName:458f83e2-e97a-457a-9081-a5ae099b6973 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:57.836853948 +0000 UTC m=+5.015615954 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs") pod "network-metrics-daemon-5j478" (UID: "458f83e2-e97a-457a-9081-a5ae099b6973") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:55.937766 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:55.937682 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7dq\" (UniqueName: \"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: \"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:53:55.937926 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:55.937885 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:55.937926 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:55.937904 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:55.937926 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:55.937916 2572 projected.go:194] Error preparing data for projected volume kube-api-access-lk7dq for pod openshift-network-diagnostics/network-check-target-qvsgg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:55.938116 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:55.937974 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq podName:dc2e7638-ebb5-4713-a221-8c885ed0b19d nodeName:}" failed. No retries permitted until 2026-04-16 19:53:57.937955292 +0000 UTC m=+5.116717309 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lk7dq" (UniqueName: "kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq") pod "network-check-target-qvsgg" (UID: "dc2e7638-ebb5-4713-a221-8c885ed0b19d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:56.139637 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:56.139599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:53:56.140083 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:56.139744 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:53:56.140083 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:56.139802 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret podName:fec5c910-3cf0-49a0-b436-62a7236c7d68 nodeName:}" failed. No retries permitted until 2026-04-16 19:53:58.139783613 +0000 UTC m=+5.318545620 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret") pod "global-pull-secret-syncer-9mwx9" (UID: "fec5c910-3cf0-49a0-b436-62a7236c7d68") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:53:56.258463 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:56.257938 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:48:54 +0000 UTC" deadline="2027-10-14 06:36:02.567908876 +0000 UTC"
Apr 16 19:53:56.258463 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:56.257976 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13090h42m6.309936694s"
Apr 16 19:53:56.361301 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:56.361212 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:53:56.361470 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:56.361338 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68"
Apr 16 19:53:56.361972 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:56.361212 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:53:56.361972 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:56.361674 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:53:56.361972 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:56.361768 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d"
Apr 16 19:53:56.361972 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:56.361879 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:53:57.854562 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:57.854527 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:53:57.855032 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:57.854690 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:57.855032 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:57.854762 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs podName:458f83e2-e97a-457a-9081-a5ae099b6973 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.854740239 +0000 UTC m=+9.033502246 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs") pod "network-metrics-daemon-5j478" (UID: "458f83e2-e97a-457a-9081-a5ae099b6973") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:53:57.955286 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:57.955247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7dq\" (UniqueName: \"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: \"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:53:57.955533 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:57.955515 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:53:57.955600 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:57.955540 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:53:57.955600 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:57.955553 2572 projected.go:194] Error preparing data for projected volume kube-api-access-lk7dq for pod openshift-network-diagnostics/network-check-target-qvsgg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:57.955710 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:57.955611 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq podName:dc2e7638-ebb5-4713-a221-8c885ed0b19d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.955592975 +0000 UTC m=+9.134354982 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lk7dq" (UniqueName: "kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq") pod "network-check-target-qvsgg" (UID: "dc2e7638-ebb5-4713-a221-8c885ed0b19d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:53:58.157103 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:58.157011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:53:58.157265 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:58.157179 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:53:58.157265 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:58.157250 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret podName:fec5c910-3cf0-49a0-b436-62a7236c7d68 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:02.157232537 +0000 UTC m=+9.335994543 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret") pod "global-pull-secret-syncer-9mwx9" (UID: "fec5c910-3cf0-49a0-b436-62a7236c7d68") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:53:58.361821 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:58.361788 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:53:58.361979 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:58.361827 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:53:58.361979 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:53:58.361788 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:53:58.361979 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:58.361915 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d"
Apr 16 19:53:58.362218 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:58.362001 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68"
Apr 16 19:53:58.362218 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:53:58.362109 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:54:00.361124 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:00.361079 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:00.361460 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:00.361079 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:00.361460 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:00.361225 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d"
Apr 16 19:54:00.361460 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:00.361423 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:54:00.361563 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:00.361499 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:00.361604 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:00.361558 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68"
Apr 16 19:54:01.883472 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:01.883401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:01.883892 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:01.883590 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:01.883892 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:01.883663 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs podName:458f83e2-e97a-457a-9081-a5ae099b6973 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:09.883642223 +0000 UTC m=+17.062404289 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs") pod "network-metrics-daemon-5j478" (UID: "458f83e2-e97a-457a-9081-a5ae099b6973") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:01.984915 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:01.984799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7dq\" (UniqueName: \"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: \"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:01.985142 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:01.984994 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:01.985142 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:01.985015 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:01.985142 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:01.985026 2572 projected.go:194] Error preparing data for projected volume kube-api-access-lk7dq for pod openshift-network-diagnostics/network-check-target-qvsgg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:01.985142 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:01.985103 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq podName:dc2e7638-ebb5-4713-a221-8c885ed0b19d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:09.985070903 +0000 UTC m=+17.163832906 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-lk7dq" (UniqueName: "kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq") pod "network-check-target-qvsgg" (UID: "dc2e7638-ebb5-4713-a221-8c885ed0b19d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:02.186809 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:02.186720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:02.186945 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:02.186901 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:02.186995 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:02.186982 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret podName:fec5c910-3cf0-49a0-b436-62a7236c7d68 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:10.186961231 +0000 UTC m=+17.365723250 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret") pod "global-pull-secret-syncer-9mwx9" (UID: "fec5c910-3cf0-49a0-b436-62a7236c7d68") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:02.361285 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:02.361257 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:02.361285 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:02.361290 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:02.361500 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:02.361363 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68"
Apr 16 19:54:02.361500 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:02.361403 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:02.361500 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:02.361482 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:54:02.361634 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:02.361578 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d"
Apr 16 19:54:04.361134 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:04.361085 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:04.361583 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:04.361110 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:04.361583 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:04.361241 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d"
Apr 16 19:54:04.361583 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:04.361315 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68"
Apr 16 19:54:04.361583 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:04.361109 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:04.361583 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:04.361448 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:54:06.361185 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:06.361143 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:06.361629 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:06.361254 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:06.361629 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:06.361255 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68"
Apr 16 19:54:06.361629 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:06.361287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:06.361629 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:06.361383 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d"
Apr 16 19:54:06.361629 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:06.361451 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:54:08.361595 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:08.361325 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:08.362026 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:08.361329 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:08.362026 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:08.361639 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68"
Apr 16 19:54:08.362026 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:08.361748 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:54:08.362026 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:08.361366 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:08.362026 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:08.361839 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d"
Apr 16 19:54:09.952324 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:09.952294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:09.952828 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:09.952462 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:09.952828 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:09.952564 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs podName:458f83e2-e97a-457a-9081-a5ae099b6973 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:25.952540823 +0000 UTC m=+33.131302829 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs") pod "network-metrics-daemon-5j478" (UID: "458f83e2-e97a-457a-9081-a5ae099b6973") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:10.052809 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:10.052776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7dq\" (UniqueName: \"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: \"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:10.052952 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.052934 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:10.053019 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.052956 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:10.053019 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.052973 2572 projected.go:194] Error preparing data for projected volume kube-api-access-lk7dq for pod openshift-network-diagnostics/network-check-target-qvsgg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:10.053112 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.053032 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq podName:dc2e7638-ebb5-4713-a221-8c885ed0b19d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.05301267 +0000 UTC m=+33.231774688 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-lk7dq" (UniqueName: "kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq") pod "network-check-target-qvsgg" (UID: "dc2e7638-ebb5-4713-a221-8c885ed0b19d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:10.254227 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:10.254156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:10.254361 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.254259 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:10.254361 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.254310 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret podName:fec5c910-3cf0-49a0-b436-62a7236c7d68 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.254292226 +0000 UTC m=+33.433054230 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret") pod "global-pull-secret-syncer-9mwx9" (UID: "fec5c910-3cf0-49a0-b436-62a7236c7d68") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:10.361112 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:10.361074 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:10.361258 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:10.361075 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:10.361258 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.361209 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68"
Apr 16 19:54:10.361258 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:10.361219 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:10.361394 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.361303 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:54:10.361394 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:10.361383 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d"
Apr 16 19:54:12.361631 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.361504 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:12.361631 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.361515 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:54:12.362177 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.361515 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:12.362177 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:12.361751 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973"
Apr 16 19:54:12.362177 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:12.361753 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d" Apr 16 19:54:12.362177 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:12.361792 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68" Apr 16 19:54:12.460198 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.460165 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" event={"ID":"d41807f7-41a1-4bc5-bb49-bc957656ab37","Type":"ContainerStarted","Data":"02772acd941120c6f7b8f62da5ee274f60207b54a6eda75e746f704c08522a17"} Apr 16 19:54:12.462104 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.462057 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mjhhc" event={"ID":"74091848-cf67-4c6a-9ad9-adc12a5e47ad","Type":"ContainerStarted","Data":"e4322d32a1d0e53d23ab627e1b26989eb84901330778551b2af29492f16a81b8"} Apr 16 19:54:12.468933 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.468910 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 19:54:12.469351 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.469319 2572 generic.go:358] "Generic (PLEG): container finished" podID="40cddc46-c9be-411b-92b8-a65e04009fc9" containerID="6259ae1d7abed2754c035d7cb4e255c41f702615f6ed81bb3a67343334d0759a" exitCode=1 Apr 16 19:54:12.469438 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.469353 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" 
event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"92c666288ace01ed3ef11cd87e7d097a00f5ac073846f4dd4fe8b6a0b232cf46"} Apr 16 19:54:12.469438 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.469393 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"a31bacd95414f5d5286dd304d2258d71b646010a4a5b76bddc81cef25b3b8fa7"} Apr 16 19:54:12.469438 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.469409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"04ea0c736d35ffd92ed49fe493bb7358efdd9777f5a0c42b4ec42393c92ab16b"} Apr 16 19:54:12.469438 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.469421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerDied","Data":"6259ae1d7abed2754c035d7cb4e255c41f702615f6ed81bb3a67343334d0759a"} Apr 16 19:54:12.469438 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.469437 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"b599c055825fb587f6de6af11bbbc4a873f327fae3019d03c1635b0650efe6f0"} Apr 16 19:54:12.473084 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.473045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" event={"ID":"0ef9ecbb9850ae7e0cccd45a7695b7be","Type":"ContainerStarted","Data":"a772273c47b01e2c38cb777947cc4067ea82866a2fb876c50824e58bd43cdeb3"} Apr 16 19:54:12.497608 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.497545 2572 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-node-tuning-operator/tuned-d2lfx" podStartSLOduration=2.257555061 podStartE2EDuration="19.497531354s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:53:54.560918032 +0000 UTC m=+1.739680038" lastFinishedPulling="2026-04-16 19:54:11.800894321 +0000 UTC m=+18.979656331" observedRunningTime="2026-04-16 19:54:12.478297604 +0000 UTC m=+19.657059630" watchObservedRunningTime="2026-04-16 19:54:12.497531354 +0000 UTC m=+19.676293357" Apr 16 19:54:12.514637 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:12.514581 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mjhhc" podStartSLOduration=1.917583958 podStartE2EDuration="19.514562099s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:53:54.533237808 +0000 UTC m=+1.711999812" lastFinishedPulling="2026-04-16 19:54:12.13021595 +0000 UTC m=+19.308977953" observedRunningTime="2026-04-16 19:54:12.497343638 +0000 UTC m=+19.676105664" watchObservedRunningTime="2026-04-16 19:54:12.514562099 +0000 UTC m=+19.693324125" Apr 16 19:54:13.476105 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.475787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" event={"ID":"ffafc088-a777-4b43-b168-c2fdb6bdbbab","Type":"ContainerStarted","Data":"5ea4fd43dbf28f75962354d442e047ed4fb29592ec4f3b65487166551279c6a2"} Apr 16 19:54:13.477117 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.477075 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dxbfj" event={"ID":"9fd607fa-9f37-4836-a69c-147c1052dbc4","Type":"ContainerStarted","Data":"7ae963e5e097b11fbdd633c3b8c4b431708605770986f453a9301309a30c2e37"} Apr 16 19:54:13.478864 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.478830 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-mlhp9" event={"ID":"4ee2a187-ea67-4cf0-a11b-3bf8404d67b6","Type":"ContainerStarted","Data":"1b35baaf4c4dc4091748ba2d9ff4d88ff9fbe3cfe7ba88d3c3ac7a5e8a2d4688"} Apr 16 19:54:13.480231 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.480209 2572 generic.go:358] "Generic (PLEG): container finished" podID="2cd1935c-d57b-4e12-881b-0c81444e85ac" containerID="f2aeb3de420e2a94589d82dc39dff048ee3b4ec62bce8ca3dbc3658f5f3448d8" exitCode=0 Apr 16 19:54:13.480328 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.480298 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerDied","Data":"f2aeb3de420e2a94589d82dc39dff048ee3b4ec62bce8ca3dbc3658f5f3448d8"} Apr 16 19:54:13.483037 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.483015 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 19:54:13.483455 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.483434 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"f404964265fab5e7b0a78198d125536f53f92d9100ecffc5292ee2981e9b84f0"} Apr 16 19:54:13.484883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.484864 2572 generic.go:358] "Generic (PLEG): container finished" podID="a59663744879090502bab85c5c499c1b" containerID="eeda2b4a449aa90df56d0b602b7f463938c708c6dcf734cfb72db91a6d0b33ad" exitCode=0 Apr 16 19:54:13.484977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.484931 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" 
event={"ID":"a59663744879090502bab85c5c499c1b","Type":"ContainerDied","Data":"eeda2b4a449aa90df56d0b602b7f463938c708c6dcf734cfb72db91a6d0b33ad"} Apr 16 19:54:13.486253 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.486231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2jzjc" event={"ID":"b9261443-4578-43ee-abc0-7931d8ab9f10","Type":"ContainerStarted","Data":"656e9c9fec0844f941ede2090941ad78cf875563dc45f86b071552d82c0df6dd"} Apr 16 19:54:13.491572 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.491535 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dxbfj" podStartSLOduration=3.272980571 podStartE2EDuration="20.491524936s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:53:54.556564881 +0000 UTC m=+1.735326895" lastFinishedPulling="2026-04-16 19:54:11.775109254 +0000 UTC m=+18.953871260" observedRunningTime="2026-04-16 19:54:13.491476726 +0000 UTC m=+20.670238777" watchObservedRunningTime="2026-04-16 19:54:13.491524936 +0000 UTC m=+20.670286962" Apr 16 19:54:13.491660 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.491622 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-48.ec2.internal" podStartSLOduration=19.491616007 podStartE2EDuration="19.491616007s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:12.514190115 +0000 UTC m=+19.692952150" watchObservedRunningTime="2026-04-16 19:54:13.491616007 +0000 UTC m=+20.670378033" Apr 16 19:54:13.530223 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.530178 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mlhp9" podStartSLOduration=3.281280049 podStartE2EDuration="20.530162799s" 
podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:53:54.550171341 +0000 UTC m=+1.728933359" lastFinishedPulling="2026-04-16 19:54:11.799054105 +0000 UTC m=+18.977816109" observedRunningTime="2026-04-16 19:54:13.504812369 +0000 UTC m=+20.683574394" watchObservedRunningTime="2026-04-16 19:54:13.530162799 +0000 UTC m=+20.708924827" Apr 16 19:54:13.565582 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.565555 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:54:13.582001 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:13.581966 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2jzjc" podStartSLOduration=7.128754839 podStartE2EDuration="20.581954742s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:53:54.564930066 +0000 UTC m=+1.743692073" lastFinishedPulling="2026-04-16 19:54:08.018129969 +0000 UTC m=+15.196891976" observedRunningTime="2026-04-16 19:54:13.560712263 +0000 UTC m=+20.739474293" watchObservedRunningTime="2026-04-16 19:54:13.581954742 +0000 UTC m=+20.760716766" Apr 16 19:54:14.278393 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.278275 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:13.565579046Z","UUID":"9c8d2e85-b7e3-465f-86cc-28e038ae40fb","Handler":null,"Name":"","Endpoint":""} Apr 16 19:54:14.281522 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.281498 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:54:14.281639 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.281531 2572 csi_plugin.go:119] kubernetes.io/csi: 
Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:54:14.362011 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.361976 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:54:14.362149 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.361981 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:14.362220 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:14.362177 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d" Apr 16 19:54:14.362220 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:14.362193 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973" Apr 16 19:54:14.362220 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.361980 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:54:14.362359 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:14.362284 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68" Apr 16 19:54:14.490793 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.490762 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" event={"ID":"ffafc088-a777-4b43-b168-c2fdb6bdbbab","Type":"ContainerStarted","Data":"25dfe5a61c77f99174822d7c3e4fbeb0f3b1d7092ad000ac710b70f1042d392e"} Apr 16 19:54:14.491208 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.490803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" event={"ID":"ffafc088-a777-4b43-b168-c2fdb6bdbbab","Type":"ContainerStarted","Data":"b04e1449c2d9abd9670e162699780ae3d6acf8d77d71f9773e5c11ca1802468e"} Apr 16 19:54:14.492531 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.492496 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" event={"ID":"a59663744879090502bab85c5c499c1b","Type":"ContainerStarted","Data":"c2a7c6a8aa8e714352b9a076a95b29974ef8b26f1fda0f2fcf19918914f17c82"} Apr 16 19:54:14.506884 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.506844 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lnmvl" podStartSLOduration=1.74413267 podStartE2EDuration="21.506828356s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" 
firstStartedPulling="2026-04-16 19:53:54.542080605 +0000 UTC m=+1.720842621" lastFinishedPulling="2026-04-16 19:54:14.304776294 +0000 UTC m=+21.483538307" observedRunningTime="2026-04-16 19:54:14.506430318 +0000 UTC m=+21.685192344" watchObservedRunningTime="2026-04-16 19:54:14.506828356 +0000 UTC m=+21.685590380" Apr 16 19:54:14.522517 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:14.522476 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-48.ec2.internal" podStartSLOduration=20.522463968 podStartE2EDuration="20.522463968s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:14.52231651 +0000 UTC m=+21.701078534" watchObservedRunningTime="2026-04-16 19:54:14.522463968 +0000 UTC m=+21.701225992" Apr 16 19:54:15.287956 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:15.287926 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:54:15.288582 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:15.288553 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:54:15.497682 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:15.497647 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 19:54:15.498167 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:15.498141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"a50e43744e530212e12ede2d759bb2a912c4c417f2e41cb4b59eadcafde31d36"} Apr 16 19:54:15.498691 ip-10-0-128-48 kubenswrapper[2572]: I0416 
19:54:15.498667 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:54:15.499191 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:15.499161 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dxbfj" Apr 16 19:54:16.361180 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:16.361152 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:54:16.361342 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:16.361152 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:54:16.361342 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:16.361263 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d" Apr 16 19:54:16.361444 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:16.361357 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68" Apr 16 19:54:16.361444 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:16.361159 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:16.361534 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:16.361467 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973" Apr 16 19:54:18.361337 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.361186 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:18.361915 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.361226 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:54:18.361915 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:18.361415 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973" Apr 16 19:54:18.361915 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:18.361464 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d" Apr 16 19:54:18.361915 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.361248 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:54:18.361915 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:18.361528 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68" Apr 16 19:54:18.504730 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.504675 2572 generic.go:358] "Generic (PLEG): container finished" podID="2cd1935c-d57b-4e12-881b-0c81444e85ac" containerID="450dade32e8e2f95aa8580068a545fe545baa6c3bf74ad0242ea7085fb6553d1" exitCode=0 Apr 16 19:54:18.504841 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.504738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerDied","Data":"450dade32e8e2f95aa8580068a545fe545baa6c3bf74ad0242ea7085fb6553d1"} Apr 16 19:54:18.507636 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.507620 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 19:54:18.508000 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.507979 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"2311b40e18539e7ee92fc47c7b93ca6d1346cf98591d6798b878faeca579034c"} 
Apr 16 19:54:18.508306 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.508292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:54:18.508386 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.508311 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:54:18.508386 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.508323 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:54:18.508480 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.508383 2572 scope.go:117] "RemoveContainer" containerID="6259ae1d7abed2754c035d7cb4e255c41f702615f6ed81bb3a67343334d0759a" Apr 16 19:54:18.523155 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.523137 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:54:18.523417 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:18.523401 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:54:19.512720 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.512485 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerStarted","Data":"d3a7d86c1efa87e678a5b0af919338211251287fa038e4006bebc8938a690a59"} Apr 16 19:54:19.515830 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.515807 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 19:54:19.516191 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.516169 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" event={"ID":"40cddc46-c9be-411b-92b8-a65e04009fc9","Type":"ContainerStarted","Data":"f0a34e7a73da6cd2214878a2d61aac8e10794830dbd02b5626c82a0bc3f95a67"} Apr 16 19:54:19.837789 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.837740 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" podStartSLOduration=9.527566185 podStartE2EDuration="26.837724163s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:53:54.525889423 +0000 UTC m=+1.704651425" lastFinishedPulling="2026-04-16 19:54:11.836047395 +0000 UTC m=+19.014809403" observedRunningTime="2026-04-16 19:54:19.570176842 +0000 UTC m=+26.748938867" watchObservedRunningTime="2026-04-16 19:54:19.837724163 +0000 UTC m=+27.016486188" Apr 16 19:54:19.838507 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.838483 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5j478"] Apr 16 19:54:19.838647 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.838614 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:19.838750 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:19.838733 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973" Apr 16 19:54:19.841585 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.841564 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qvsgg"] Apr 16 19:54:19.841697 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.841646 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:54:19.841751 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:19.841735 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d" Apr 16 19:54:19.842184 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.842164 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9mwx9"] Apr 16 19:54:19.842277 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:19.842247 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:54:19.842350 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:19.842327 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68" Apr 16 19:54:20.488376 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.488313 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-t4xmk"] Apr 16 19:54:20.519612 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.519584 2572 generic.go:358] "Generic (PLEG): container finished" podID="2cd1935c-d57b-4e12-881b-0c81444e85ac" containerID="d3a7d86c1efa87e678a5b0af919338211251287fa038e4006bebc8938a690a59" exitCode=0 Apr 16 19:54:20.526211 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.526187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerDied","Data":"d3a7d86c1efa87e678a5b0af919338211251287fa038e4006bebc8938a690a59"} Apr 16 19:54:20.526323 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.526278 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.529856 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.529835 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j2vdp\"" Apr 16 19:54:20.530084 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.530070 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:54:20.531305 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.531283 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:54:20.638248 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.638224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e34d0c6b-2806-42a6-9665-4b769fb05f24-tmp-dir\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.638248 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.638255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e34d0c6b-2806-42a6-9665-4b769fb05f24-hosts-file\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.638459 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.638422 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltnj8\" (UniqueName: \"kubernetes.io/projected/e34d0c6b-2806-42a6-9665-4b769fb05f24-kube-api-access-ltnj8\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.739562 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.739501 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e34d0c6b-2806-42a6-9665-4b769fb05f24-hosts-file\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.739669 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.739569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltnj8\" (UniqueName: \"kubernetes.io/projected/e34d0c6b-2806-42a6-9665-4b769fb05f24-kube-api-access-ltnj8\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.739669 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.739624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e34d0c6b-2806-42a6-9665-4b769fb05f24-tmp-dir\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.739669 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.739631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e34d0c6b-2806-42a6-9665-4b769fb05f24-hosts-file\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.739923 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.739906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e34d0c6b-2806-42a6-9665-4b769fb05f24-tmp-dir\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.750642 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.750624 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ltnj8\" (UniqueName: \"kubernetes.io/projected/e34d0c6b-2806-42a6-9665-4b769fb05f24-kube-api-access-ltnj8\") pod \"node-resolver-t4xmk\" (UID: \"e34d0c6b-2806-42a6-9665-4b769fb05f24\") " pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.835452 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:20.835427 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t4xmk" Apr 16 19:54:20.842318 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:20.842282 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode34d0c6b_2806_42a6_9665_4b769fb05f24.slice/crio-b7d2f068a2b39be8ff074d4825593fea69cdf93838000661efc808d02aeb0d62 WatchSource:0}: Error finding container b7d2f068a2b39be8ff074d4825593fea69cdf93838000661efc808d02aeb0d62: Status 404 returned error can't find the container with id b7d2f068a2b39be8ff074d4825593fea69cdf93838000661efc808d02aeb0d62 Apr 16 19:54:21.361487 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:21.361338 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:54:21.361620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:21.361342 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:21.361620 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:21.361564 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d" Apr 16 19:54:21.361694 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:21.361641 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973" Apr 16 19:54:21.361694 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:21.361344 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:54:21.361751 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:21.361710 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68" Apr 16 19:54:21.522434 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:21.522398 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t4xmk" event={"ID":"e34d0c6b-2806-42a6-9665-4b769fb05f24","Type":"ContainerStarted","Data":"974c93b46aa6704ceb0ba32d3af8befcaea6f0ac36ecc0b0f927c0769e36d5fa"} Apr 16 19:54:21.522796 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:21.522443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t4xmk" event={"ID":"e34d0c6b-2806-42a6-9665-4b769fb05f24","Type":"ContainerStarted","Data":"b7d2f068a2b39be8ff074d4825593fea69cdf93838000661efc808d02aeb0d62"} Apr 16 19:54:22.526176 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:22.526140 2572 generic.go:358] "Generic (PLEG): container finished" podID="2cd1935c-d57b-4e12-881b-0c81444e85ac" containerID="9e6ba743c64306dce7e83cc9d6cdd3bea6c6172a1701e718ae973a410abe0b10" exitCode=0 Apr 16 19:54:22.526702 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:22.526203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerDied","Data":"9e6ba743c64306dce7e83cc9d6cdd3bea6c6172a1701e718ae973a410abe0b10"} Apr 16 19:54:22.554277 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:22.554234 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t4xmk" podStartSLOduration=2.554220195 podStartE2EDuration="2.554220195s" podCreationTimestamp="2026-04-16 19:54:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:21.545364427 +0000 UTC m=+28.724126458" watchObservedRunningTime="2026-04-16 19:54:22.554220195 +0000 UTC m=+29.732982219" Apr 16 19:54:23.362373 ip-10-0-128-48 
kubenswrapper[2572]: I0416 19:54:23.362336 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:54:23.362551 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:23.362415 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:54:23.362551 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:23.362516 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qvsgg" podUID="dc2e7638-ebb5-4713-a221-8c885ed0b19d" Apr 16 19:54:23.362661 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:23.362558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:23.362710 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:23.362680 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5j478" podUID="458f83e2-e97a-457a-9081-a5ae099b6973" Apr 16 19:54:23.362788 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:23.362769 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9mwx9" podUID="fec5c910-3cf0-49a0-b436-62a7236c7d68" Apr 16 19:54:25.168530 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.168495 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-48.ec2.internal" event="NodeReady" Apr 16 19:54:25.169065 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.168649 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:54:25.213078 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.213048 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc"] Apr 16 19:54:25.236844 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.236814 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz"] Apr 16 19:54:25.237051 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.237028 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc" Apr 16 19:54:25.239995 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.239974 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.240112 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.240004 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.240188 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.240159 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-vjbrh\"" Apr 16 19:54:25.253473 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.253227 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-656d5cd769-kngz2"] Apr 16 19:54:25.253473 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.253440 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.257386 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.257366 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 19:54:25.257620 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.257598 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-82hlq\"" Apr 16 19:54:25.257713 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.257608 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.257867 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.257849 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 19:54:25.261566 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.261309 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.274158 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.272941 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-74fc646496-l58qj"] Apr 16 19:54:25.292928 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.292907 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gm4s4"] Apr 16 19:54:25.293047 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.293026 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.293339 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.293277 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.297295 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.297271 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.299357 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.299330 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 19:54:25.300087 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.300068 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 19:54:25.301156 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.300279 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 19:54:25.301156 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.300342 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfjh6\"" Apr 16 19:54:25.301156 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.300373 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 19:54:25.301156 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.300433 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 19:54:25.301156 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.300676 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 19:54:25.301378 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.301218 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.301378 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.301362 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-q92g7\"" Apr 16 19:54:25.301442 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.301420 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 19:54:25.305891 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.305872 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 19:54:25.311479 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.311461 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2xttw"] Apr 16 19:54:25.311619 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.311602 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.314291 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.314274 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pxzc9\"" Apr 16 19:54:25.314375 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.314356 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 19:54:25.314649 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.314633 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.314758 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.314739 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 19:54:25.315422 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.315399 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.320522 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.320495 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 19:54:25.329286 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.329264 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c"] Apr 16 19:54:25.329412 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.329388 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.332575 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.332556 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 19:54:25.333019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.332915 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.333019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.332982 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 19:54:25.333802 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.333784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-h5947\"" Apr 16 19:54:25.333878 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.333814 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.338743 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.338245 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 19:54:25.351591 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.351570 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8"] Apr 16 19:54:25.351740 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.351724 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:25.354547 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.354405 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 19:54:25.354547 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.354420 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.354547 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.354438 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jvpqq\"" Apr 16 19:54:25.354758 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.354583 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.367940 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.367920 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.367940 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.367936 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9" Apr 16 19:54:25.368087 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.367928 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:25.368087 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.368042 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:54:25.371465 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.371424 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc"] Apr 16 19:54:25.371465 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.371455 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p"] Apr 16 19:54:25.372609 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.372561 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.372738 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.372724 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 19:54:25.372809 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.372793 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:54:25.372971 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.372950 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.373077 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.373060 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:54:25.373166 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.373127 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jwf5k\"" Apr 16 19:54:25.373292 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.373274 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-25nwv\"" Apr 16 19:54:25.373394 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.373375 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.373514 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.373499 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 19:54:25.373823 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.373808 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.373931 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.373913 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2m6x\"" Apr 16 19:54:25.376849 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.376825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c37c23a9-747d-419a-8a58-907dc11ecf6d-ca-trust-extracted\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.376946 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.376861 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.376946 ip-10-0-128-48 kubenswrapper[2572]: I0416 
19:54:25.376893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-trusted-ca\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.376946 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.376914 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftxm9\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-kube-api-access-ftxm9\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.376946 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.376929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.376976 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-bound-sa-token\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.376994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377042 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-certificates\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-679zt\" (UniqueName: \"kubernetes.io/projected/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-kube-api-access-679zt\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-image-registry-private-configuration\") pod 
\"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-installation-pull-secrets\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-default-certificate\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.377182 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-stats-auth\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.377611 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x78bl\" (UniqueName: \"kubernetes.io/projected/72d1ed32-dd1f-4c30-adc3-db411de6c394-kube-api-access-x78bl\") pod \"volume-data-source-validator-7c6cbb6c87-6cnrc\" (UID: \"72d1ed32-dd1f-4c30-adc3-db411de6c394\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc" Apr 16 
19:54:25.377611 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.377611 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.377287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk9w8\" (UniqueName: \"kubernetes.io/projected/142d9ab9-4b05-479d-b198-2760a09292d1-kube-api-access-hk9w8\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.385347 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.385327 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg"] Apr 16 19:54:25.385478 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.385462 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" Apr 16 19:54:25.388315 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.388294 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-d6dnh\"" Apr 16 19:54:25.388406 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.388314 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 19:54:25.388406 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.388377 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 19:54:25.388514 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.388425 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.388514 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.388294 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.403716 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.403696 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw"] Apr 16 19:54:25.403848 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.403830 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.407320 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.407262 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 19:54:25.407320 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.407279 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.407320 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.407265 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.407967 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.407950 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-84znb\"" Apr 16 19:54:25.408027 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.407959 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 19:54:25.422262 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.422216 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf"] Apr 16 19:54:25.422368 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.422353 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.425185 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.425167 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 19:54:25.439853 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.439836 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl"] Apr 16 19:54:25.439991 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.439975 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:25.442892 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.442875 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 19:54:25.443169 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.443149 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-w9jfp\"" Apr 16 19:54:25.443563 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.443548 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 19:54:25.454613 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.454594 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf"] Apr 16 19:54:25.454613 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.454611 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl" Apr 16 19:54:25.458208 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.457633 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9clf4"] Apr 16 19:54:25.458432 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.458367 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.458528 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.458507 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-gcdwj\"" Apr 16 19:54:25.460923 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.460903 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz"] Apr 16 19:54:25.461019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.460928 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8"] Apr 16 19:54:25.461019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.460941 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-656d5cd769-kngz2"] Apr 16 19:54:25.461019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.460952 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2xttw"] Apr 16 19:54:25.461019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.460961 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-74fc646496-l58qj"] Apr 16 19:54:25.461019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.460968 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c"] Apr 16 19:54:25.461019 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.460981 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ppsrs"] Apr 16 19:54:25.461322 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.461064 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.461322 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.461232 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 19:54:25.461422 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.461343 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 19:54:25.461829 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.461813 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 19:54:25.461923 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.461905 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 19:54:25.462728 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462713 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw"] Apr 16 19:54:25.462802 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462731 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl"] Apr 16 19:54:25.462802 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462739 2572 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p"] Apr 16 19:54:25.462802 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462751 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf"] Apr 16 19:54:25.462802 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462758 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg"] Apr 16 19:54:25.462971 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462827 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:25.462971 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462841 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gm4s4"] Apr 16 19:54:25.462971 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462853 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ppsrs"] Apr 16 19:54:25.462971 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462863 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf"] Apr 16 19:54:25.462971 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.462874 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9clf4"] Apr 16 19:54:25.464751 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.464732 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:54:25.468020 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.468001 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:54:25.468376 ip-10-0-128-48 kubenswrapper[2572]: I0416 
19:54:25.468357 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:54:25.468465 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.468413 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lx244\"" Apr 16 19:54:25.468529 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.468465 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6lbj8\"" Apr 16 19:54:25.468651 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.468637 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:54:25.469273 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.469256 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:54:25.478025 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478007 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk9w8\" (UniqueName: \"kubernetes.io/projected/142d9ab9-4b05-479d-b198-2760a09292d1-kube-api-access-hk9w8\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.478025 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzw9x\" (UniqueName: \"kubernetes.io/projected/5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14-kube-api-access-mzw9x\") pod \"managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p\" (UID: \"5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" Apr 16 19:54:25.478025 ip-10-0-128-48 
kubenswrapper[2572]: I0416 19:54:25.478055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478082 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvjhv\" (UniqueName: \"kubernetes.io/projected/2090cd61-18f9-425a-bf42-4b9628417aad-kube-api-access-kvjhv\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478129 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjz2\" (UniqueName: \"kubernetes.io/projected/684e28e1-6369-483c-abf7-b3f82437af2c-kube-api-access-vsjz2\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7gqb\" (UniqueName: \"kubernetes.io/projected/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-kube-api-access-j7gqb\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.478287 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:25.978267275 +0000 UTC m=+33.157029297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.478343 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.478412 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:54:25.978396179 +0000 UTC m=+33.157158183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : secret "router-metrics-certs-default" not found Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p\" (UID: \"5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" Apr 16 19:54:25.478466 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c485156a-3973-41b9-9936-2cd58d6d6ea4-serving-cert\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-certificates\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-679zt\" (UniqueName: 
\"kubernetes.io/projected/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-kube-api-access-679zt\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x78bl\" (UniqueName: \"kubernetes.io/projected/72d1ed32-dd1f-4c30-adc3-db411de6c394-kube-api-access-x78bl\") pod \"volume-data-source-validator-7c6cbb6c87-6cnrc\" (UID: \"72d1ed32-dd1f-4c30-adc3-db411de6c394\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-stats-auth\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-installation-pull-secrets\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478697 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-default-certificate\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.478983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-config\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479018 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tccvx\" (UniqueName: \"kubernetes.io/projected/c485156a-3973-41b9-9936-2cd58d6d6ea4-kube-api-access-tccvx\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 
19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c37c23a9-747d-419a-8a58-907dc11ecf6d-ca-trust-extracted\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.479067 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2090cd61-18f9-425a-bf42-4b9628417aad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c485156a-3973-41b9-9936-2cd58d6d6ea4-snapshots\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479136 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-trusted-ca\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftxm9\" (UniqueName: 
\"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-kube-api-access-ftxm9\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-certificates\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c485156a-3973-41b9-9936-2cd58d6d6ea4-tmp\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-bound-sa-token\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c37c23a9-747d-419a-8a58-907dc11ecf6d-ca-trust-extracted\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479470 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-trusted-ca\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.479538 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479514 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.480009 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.479590 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:25.480009 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2090cd61-18f9-425a-bf42-4b9628417aad-config\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.480009 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.479601 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-656d5cd769-kngz2: secret "image-registry-tls" not found Apr 16 19:54:25.480009 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479619 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c485156a-3973-41b9-9936-2cd58d6d6ea4-service-ca-bundle\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.480009 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.479636 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls podName:c37c23a9-747d-419a-8a58-907dc11ecf6d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:25.979624388 +0000 UTC m=+33.158386406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls") pod "image-registry-656d5cd769-kngz2" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d") : secret "image-registry-tls" not found Apr 16 19:54:25.480009 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-serving-cert\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.480009 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-image-registry-private-configuration\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.480009 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.479731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c485156a-3973-41b9-9936-2cd58d6d6ea4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.480553 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.480364 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-trusted-ca\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " 
pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.482576 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.482556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.482805 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.482739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-installation-pull-secrets\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.482805 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.482739 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-default-certificate\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.482805 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.482781 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-stats-auth\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.484392 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.484375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-image-registry-private-configuration\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.492568 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.492527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-679zt\" (UniqueName: \"kubernetes.io/projected/499fb04b-1629-4d2a-8d4c-4b6f38ec093e-kube-api-access-679zt\") pod \"kube-storage-version-migrator-operator-6769c5d45-g96dz\" (UID: \"499fb04b-1629-4d2a-8d4c-4b6f38ec093e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.492771 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.492749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x78bl\" (UniqueName: \"kubernetes.io/projected/72d1ed32-dd1f-4c30-adc3-db411de6c394-kube-api-access-x78bl\") pod \"volume-data-source-validator-7c6cbb6c87-6cnrc\" (UID: \"72d1ed32-dd1f-4c30-adc3-db411de6c394\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc" Apr 16 19:54:25.493194 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.493173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk9w8\" (UniqueName: \"kubernetes.io/projected/142d9ab9-4b05-479d-b198-2760a09292d1-kube-api-access-hk9w8\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.494005 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.493985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftxm9\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-kube-api-access-ftxm9\") pod 
\"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.494914 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.494897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-bound-sa-token\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.548872 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.548850 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc" Apr 16 19:54:25.564421 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.564399 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" Apr 16 19:54:25.580328 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-serving-cert\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.580328 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580222 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/856e9e38-1147-41be-a5de-fe7ea1bec4c3-klusterlet-config\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 
19:54:25.580328 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74pjk\" (UniqueName: \"kubernetes.io/projected/a94af5f5-bc76-4166-81f8-5f322b1a86ab-kube-api-access-74pjk\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.580328 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c485156a-3973-41b9-9936-2cd58d6d6ea4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.580596 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dea42027-362c-41e6-a940-a20b985788b0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:25.580596 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580386 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzw9x\" (UniqueName: \"kubernetes.io/projected/5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14-kube-api-access-mzw9x\") pod \"managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p\" (UID: \"5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" Apr 16 19:54:25.580596 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580416 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.580596 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.580596 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvjhv\" (UniqueName: \"kubernetes.io/projected/2090cd61-18f9-425a-bf42-4b9628417aad-kube-api-access-kvjhv\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.580596 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580578 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjz2\" (UniqueName: \"kubernetes.io/projected/684e28e1-6369-483c-abf7-b3f82437af2c-kube-api-access-vsjz2\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7gqb\" (UniqueName: 
\"kubernetes.io/projected/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-kube-api-access-j7gqb\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p\" (UID: \"5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580694 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c485156a-3973-41b9-9936-2cd58d6d6ea4-serving-cert\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/856e9e38-1147-41be-a5de-fe7ea1bec4c3-tmp\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580753 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf88s\" (UniqueName: \"kubernetes.io/projected/856e9e38-1147-41be-a5de-fe7ea1bec4c3-kube-api-access-xf88s\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a94af5f5-bc76-4166-81f8-5f322b1a86ab-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6xmx\" (UniqueName: \"kubernetes.io/projected/528c7eb6-e54f-42e5-bd14-acbc205001c1-kube-api-access-t6xmx\") pod \"network-check-source-8894fc9bd-c8jsl\" (UID: \"528c7eb6-e54f-42e5-bd14-acbc205001c1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 
19:54:25.580919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-config\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.580977 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.581582 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.580985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.581582 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.581006 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:25.581582 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.581078 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls podName:684e28e1-6369-483c-abf7-b3f82437af2c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.081059214 +0000 UTC m=+33.259821217 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-trj4c" (UID: "684e28e1-6369-483c-abf7-b3f82437af2c") : secret "samples-operator-tls" not found Apr 16 19:54:25.581582 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c485156a-3973-41b9-9936-2cd58d6d6ea4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.581582 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tccvx\" (UniqueName: \"kubernetes.io/projected/c485156a-3973-41b9-9936-2cd58d6d6ea4-kube-api-access-tccvx\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.581582 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2090cd61-18f9-425a-bf42-4b9628417aad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.581582 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581552 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40b37910-94fc-4da8-aa1d-163af810d004-config-volume\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " 
pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.581921 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c485156a-3973-41b9-9936-2cd58d6d6ea4-snapshots\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.581921 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-config\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.581921 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581706 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-ca\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.581921 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40b37910-94fc-4da8-aa1d-163af810d004-tmp-dir\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.581921 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581803 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c485156a-3973-41b9-9936-2cd58d6d6ea4-tmp\") pod 
\"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.581921 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-hub\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.581921 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-trusted-ca\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.582255 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:25.582255 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.581979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn7vv\" (UniqueName: \"kubernetes.io/projected/40b37910-94fc-4da8-aa1d-163af810d004-kube-api-access-sn7vv\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.582255 ip-10-0-128-48 
kubenswrapper[2572]: I0416 19:54:25.582004 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4696\" (UniqueName: \"kubernetes.io/projected/2676cacd-954e-4b7c-a85c-1b43b90f0471-kube-api-access-j4696\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:25.582255 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.582034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qffnj\" (UniqueName: \"kubernetes.io/projected/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-kube-api-access-qffnj\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.582255 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.582078 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.582255 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.582133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2090cd61-18f9-425a-bf42-4b9628417aad-config\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.582255 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.582161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c485156a-3973-41b9-9936-2cd58d6d6ea4-service-ca-bundle\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.582255 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.582190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c485156a-3973-41b9-9936-2cd58d6d6ea4-snapshots\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.582611 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.582323 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c485156a-3973-41b9-9936-2cd58d6d6ea4-tmp\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.582711 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.582687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c485156a-3973-41b9-9936-2cd58d6d6ea4-service-ca-bundle\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.582866 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.582824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2090cd61-18f9-425a-bf42-4b9628417aad-config\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.583675 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.583624 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-trusted-ca\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.585315 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.585273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-serving-cert\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.585315 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.585273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p\" (UID: \"5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" Apr 16 19:54:25.585460 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.585347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c485156a-3973-41b9-9936-2cd58d6d6ea4-serving-cert\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.585884 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.585559 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2090cd61-18f9-425a-bf42-4b9628417aad-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.592831 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.592766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzw9x\" (UniqueName: \"kubernetes.io/projected/5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14-kube-api-access-mzw9x\") pod \"managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p\" (UID: \"5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" Apr 16 19:54:25.593649 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.593595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tccvx\" (UniqueName: \"kubernetes.io/projected/c485156a-3973-41b9-9936-2cd58d6d6ea4-kube-api-access-tccvx\") pod \"insights-operator-585dfdc468-gm4s4\" (UID: \"c485156a-3973-41b9-9936-2cd58d6d6ea4\") " pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.593649 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.593600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvjhv\" (UniqueName: \"kubernetes.io/projected/2090cd61-18f9-425a-bf42-4b9628417aad-kube-api-access-kvjhv\") pod \"service-ca-operator-d6fc45fc5-qxcc8\" (UID: \"2090cd61-18f9-425a-bf42-4b9628417aad\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.594187 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.594143 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7gqb\" (UniqueName: \"kubernetes.io/projected/d7cbe800-699f-48fe-9a8a-b74c32bf0dcc-kube-api-access-j7gqb\") pod \"console-operator-9d4b6777b-2xttw\" (UID: \"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc\") " pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.596018 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.595992 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vsjz2\" (UniqueName: \"kubernetes.io/projected/684e28e1-6369-483c-abf7-b3f82437af2c-kube-api-access-vsjz2\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:25.623396 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.622811 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gm4s4" Apr 16 19:54:25.653715 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.653288 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.683946 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/856e9e38-1147-41be-a5de-fe7ea1bec4c3-klusterlet-config\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.683992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74pjk\" (UniqueName: \"kubernetes.io/projected/a94af5f5-bc76-4166-81f8-5f322b1a86ab-kube-api-access-74pjk\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dea42027-362c-41e6-a940-a20b985788b0-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684170 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/856e9e38-1147-41be-a5de-fe7ea1bec4c3-tmp\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684240 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-xf88s\" (UniqueName: \"kubernetes.io/projected/856e9e38-1147-41be-a5de-fe7ea1bec4c3-kube-api-access-xf88s\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a94af5f5-bc76-4166-81f8-5f322b1a86ab-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6xmx\" (UniqueName: \"kubernetes.io/projected/528c7eb6-e54f-42e5-bd14-acbc205001c1-kube-api-access-t6xmx\") pod \"network-check-source-8894fc9bd-c8jsl\" (UID: \"528c7eb6-e54f-42e5-bd14-acbc205001c1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684357 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684383 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40b37910-94fc-4da8-aa1d-163af810d004-config-volume\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684438 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-ca\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684466 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40b37910-94fc-4da8-aa1d-163af810d004-tmp-dir\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.686883 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-hub\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684551 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn7vv\" (UniqueName: \"kubernetes.io/projected/40b37910-94fc-4da8-aa1d-163af810d004-kube-api-access-sn7vv\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4696\" (UniqueName: \"kubernetes.io/projected/2676cacd-954e-4b7c-a85c-1b43b90f0471-kube-api-access-j4696\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qffnj\" (UniqueName: \"kubernetes.io/projected/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-kube-api-access-qffnj\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.684678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-hub-kubeconfig\") pod 
\"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.685893 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.685972 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls podName:b9ff25d1-4296-4a79-9bfa-cad826fb48cb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.185953114 +0000 UTC m=+33.364715129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rcxgg" (UID: "b9ff25d1-4296-4a79-9bfa-cad826fb48cb") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.686053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/40b37910-94fc-4da8-aa1d-163af810d004-tmp-dir\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.687706 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.686548 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40b37910-94fc-4da8-aa1d-163af810d004-config-volume\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.688205 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.686834 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.687605 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.688744 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert podName:dea42027-362c-41e6-a940-a20b985788b0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.188728447 +0000 UTC m=+33.367490450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lcvgf" (UID: "dea42027-362c-41e6-a940-a20b985788b0") : secret "networking-console-plugin-cert" not found Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.688625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/856e9e38-1147-41be-a5de-fe7ea1bec4c3-tmp\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.688454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/dea42027-362c-41e6-a940-a20b985788b0-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.688691 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.688804 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert podName:2676cacd-954e-4b7c-a85c-1b43b90f0471 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.188793589 +0000 UTC m=+33.367555594 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert") pod "ingress-canary-ppsrs" (UID: "2676cacd-954e-4b7c-a85c-1b43b90f0471") : secret "canary-serving-cert" not found Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.689029 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.689068 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls podName:40b37910-94fc-4da8-aa1d-163af810d004 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.189055639 +0000 UTC m=+33.367817644 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls") pod "dns-default-9clf4" (UID: "40b37910-94fc-4da8-aa1d-163af810d004") : secret "dns-default-metrics-tls" not found Apr 16 19:54:25.690548 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.689485 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a94af5f5-bc76-4166-81f8-5f322b1a86ab-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.693151 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.693124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.693564 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.693539 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/856e9e38-1147-41be-a5de-fe7ea1bec4c3-klusterlet-config\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.696042 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.695404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.697011 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.696625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-ca\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.697867 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.697826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a94af5f5-bc76-4166-81f8-5f322b1a86ab-hub\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.700047 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.699952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6xmx\" (UniqueName: \"kubernetes.io/projected/528c7eb6-e54f-42e5-bd14-acbc205001c1-kube-api-access-t6xmx\") pod \"network-check-source-8894fc9bd-c8jsl\" (UID: \"528c7eb6-e54f-42e5-bd14-acbc205001c1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl" Apr 16 19:54:25.701517 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.701453 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn7vv\" (UniqueName: \"kubernetes.io/projected/40b37910-94fc-4da8-aa1d-163af810d004-kube-api-access-sn7vv\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:25.701886 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.701764 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" Apr 16 19:54:25.704446 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.704384 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4696\" (UniqueName: \"kubernetes.io/projected/2676cacd-954e-4b7c-a85c-1b43b90f0471-kube-api-access-j4696\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:25.708076 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.707502 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74pjk\" (UniqueName: \"kubernetes.io/projected/a94af5f5-bc76-4166-81f8-5f322b1a86ab-kube-api-access-74pjk\") pod \"cluster-proxy-proxy-agent-78477dcc55-b42tf\" (UID: \"a94af5f5-bc76-4166-81f8-5f322b1a86ab\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.708076 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.708040 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf88s\" (UniqueName: \"kubernetes.io/projected/856e9e38-1147-41be-a5de-fe7ea1bec4c3-kube-api-access-xf88s\") pod \"klusterlet-addon-workmgr-649f77c955-hl7lw\" (UID: \"856e9e38-1147-41be-a5de-fe7ea1bec4c3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.711479 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.711161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qffnj\" (UniqueName: \"kubernetes.io/projected/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-kube-api-access-qffnj\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:25.726642 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.726602 2572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" Apr 16 19:54:25.741227 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.741064 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:25.760489 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.758678 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc"] Apr 16 19:54:25.769627 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.769191 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl" Apr 16 19:54:25.775604 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.775292 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" Apr 16 19:54:25.787667 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.787308 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz"] Apr 16 19:54:25.802709 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:25.802663 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod499fb04b_1629_4d2a_8d4c_4b6f38ec093e.slice/crio-1e7918f1c9ea06741511c9921a3e5261663842460d0c7b9de83d22c5a9cc279a WatchSource:0}: Error finding container 1e7918f1c9ea06741511c9921a3e5261663842460d0c7b9de83d22c5a9cc279a: Status 404 returned error can't find the container with id 1e7918f1c9ea06741511c9921a3e5261663842460d0c7b9de83d22c5a9cc279a Apr 16 19:54:25.823980 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.822431 2572 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gm4s4"] Apr 16 19:54:25.863171 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.862953 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2xttw"] Apr 16 19:54:25.872485 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:25.872453 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7cbe800_699f_48fe_9a8a_b74c32bf0dcc.slice/crio-f6a0612ff95ca086c4238dd4134209250baf6e830260a4b358679d1d86c5680f WatchSource:0}: Error finding container f6a0612ff95ca086c4238dd4134209250baf6e830260a4b358679d1d86c5680f: Status 404 returned error can't find the container with id f6a0612ff95ca086c4238dd4134209250baf6e830260a4b358679d1d86c5680f Apr 16 19:54:25.904560 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.904495 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8"] Apr 16 19:54:25.912584 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:25.912535 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2090cd61_18f9_425a_bf42_4b9628417aad.slice/crio-3b889fa1fb448300038d5c2afbcf328e72e98ffb03ded82e9a98ca3c4026b8e2 WatchSource:0}: Error finding container 3b889fa1fb448300038d5c2afbcf328e72e98ffb03ded82e9a98ca3c4026b8e2: Status 404 returned error can't find the container with id 3b889fa1fb448300038d5c2afbcf328e72e98ffb03ded82e9a98ca3c4026b8e2 Apr 16 19:54:25.936649 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.936469 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p"] Apr 16 19:54:25.940778 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:25.940677 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ea56c9b_a13c_4e3d_a9c0_5a4d7c254e14.slice/crio-8ef78eea28dc2f398620c9f1a4e1a94839134a89a08f42b1b77a3fdf4042a018 WatchSource:0}: Error finding container 8ef78eea28dc2f398620c9f1a4e1a94839134a89a08f42b1b77a3fdf4042a018: Status 404 returned error can't find the container with id 8ef78eea28dc2f398620c9f1a4e1a94839134a89a08f42b1b77a3fdf4042a018 Apr 16 19:54:25.943060 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.943027 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw"] Apr 16 19:54:25.946392 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:25.946360 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod856e9e38_1147_41be_a5de_fe7ea1bec4c3.slice/crio-783e2b3d4292abc4dae73b2ca68fffebfc9fb4ac825537506be208d340ae0427 WatchSource:0}: Error finding container 783e2b3d4292abc4dae73b2ca68fffebfc9fb4ac825537506be208d340ae0427: Status 404 returned error can't find the container with id 783e2b3d4292abc4dae73b2ca68fffebfc9fb4ac825537506be208d340ae0427 Apr 16 19:54:25.973466 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.973438 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl"] Apr 16 19:54:25.976351 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:25.976316 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528c7eb6_e54f_42e5_bd14_acbc205001c1.slice/crio-889fb71762003dad25237f95d2f6a04f6ae69dd284be123a69f2358848036db1 WatchSource:0}: Error finding container 889fb71762003dad25237f95d2f6a04f6ae69dd284be123a69f2358848036db1: Status 404 returned error can't find the container with id 889fb71762003dad25237f95d2f6a04f6ae69dd284be123a69f2358848036db1 Apr 16 19:54:25.980893 ip-10-0-128-48 
kubenswrapper[2572]: I0416 19:54:25.980868 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf"] Apr 16 19:54:25.983698 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:25.983677 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda94af5f5_bc76_4166_81f8_5f322b1a86ab.slice/crio-6e2ff9045e3ffefabbb552080f3a47185e361188ee5f8839c0a95e68c8a5afc5 WatchSource:0}: Error finding container 6e2ff9045e3ffefabbb552080f3a47185e361188ee5f8839c0a95e68c8a5afc5: Status 404 returned error can't find the container with id 6e2ff9045e3ffefabbb552080f3a47185e361188ee5f8839c0a95e68c8a5afc5 Apr 16 19:54:25.988591 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.988567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:25.988717 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.988697 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 19:54:25.988815 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.988695 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:25.988815 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.988758 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs 
podName:458f83e2-e97a-457a-9081-a5ae099b6973 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.98873861 +0000 UTC m=+65.167500630 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs") pod "network-metrics-daemon-5j478" (UID: "458f83e2-e97a-457a-9081-a5ae099b6973") : secret "metrics-daemon-secret" not found Apr 16 19:54:25.988815 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.988777 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:25.988815 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.988790 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-656d5cd769-kngz2: secret "image-registry-tls" not found Apr 16 19:54:25.989031 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.988834 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls podName:c37c23a9-747d-419a-8a58-907dc11ecf6d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.988818088 +0000 UTC m=+34.167580098 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls") pod "image-registry-656d5cd769-kngz2" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d") : secret "image-registry-tls" not found Apr 16 19:54:25.989031 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.988875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.989031 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:25.988911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:25.989031 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.988984 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.98897327 +0000 UTC m=+34.167735277 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:25.989031 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.988995 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:25.989031 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:25.989029 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:26.989017762 +0000 UTC m=+34.167779766 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : secret "router-metrics-certs-default" not found Apr 16 19:54:26.089741 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.089708 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:26.089900 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.089821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7dq\" (UniqueName: \"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: 
\"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:26.089900 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.089865 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:54:26.090023 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.089954 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls podName:684e28e1-6369-483c-abf7-b3f82437af2c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.089922134 +0000 UTC m=+34.268684145 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-trj4c" (UID: "684e28e1-6369-483c-abf7-b3f82437af2c") : secret "samples-operator-tls" not found
Apr 16 19:54:26.094847 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.094824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7dq\" (UniqueName: \"kubernetes.io/projected/dc2e7638-ebb5-4713-a221-8c885ed0b19d-kube-api-access-lk7dq\") pod \"network-check-target-qvsgg\" (UID: \"dc2e7638-ebb5-4713-a221-8c885ed0b19d\") " pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:26.191064 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.190977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4"
Apr 16 19:54:26.191525 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.191133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs"
Apr 16 19:54:26.191525 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.191197 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:26.191525 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.191211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg"
Apr 16 19:54:26.191525 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.191282 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls podName:40b37910-94fc-4da8-aa1d-163af810d004 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.191262487 +0000 UTC m=+34.370024496 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls") pod "dns-default-9clf4" (UID: "40b37910-94fc-4da8-aa1d-163af810d004") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:26.191525 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.191304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf"
Apr 16 19:54:26.191525 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.191442 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:54:26.191525 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.191486 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert podName:dea42027-362c-41e6-a940-a20b985788b0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.191474627 +0000 UTC m=+34.370236631 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lcvgf" (UID: "dea42027-362c-41e6-a940-a20b985788b0") : secret "networking-console-plugin-cert" not found
Apr 16 19:54:26.191867 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.191544 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:26.191867 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.191557 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:26.191867 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.191587 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls podName:b9ff25d1-4296-4a79-9bfa-cad826fb48cb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.191575014 +0000 UTC m=+34.370337021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rcxgg" (UID: "b9ff25d1-4296-4a79-9bfa-cad826fb48cb") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:26.191867 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.191614 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert podName:2676cacd-954e-4b7c-a85c-1b43b90f0471 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:27.191597316 +0000 UTC m=+34.370359330 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert") pod "ingress-canary-ppsrs" (UID: "2676cacd-954e-4b7c-a85c-1b43b90f0471") : secret "canary-serving-cert" not found
Apr 16 19:54:26.292169 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.292126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:26.294674 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.294648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qvsgg"
Apr 16 19:54:26.294886 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.294870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fec5c910-3cf0-49a0-b436-62a7236c7d68-original-pull-secret\") pod \"global-pull-secret-syncer-9mwx9\" (UID: \"fec5c910-3cf0-49a0-b436-62a7236c7d68\") " pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:26.535304 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.535198 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" event={"ID":"a94af5f5-bc76-4166-81f8-5f322b1a86ab","Type":"ContainerStarted","Data":"6e2ff9045e3ffefabbb552080f3a47185e361188ee5f8839c0a95e68c8a5afc5"}
Apr 16 19:54:26.536706 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.536661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc" event={"ID":"72d1ed32-dd1f-4c30-adc3-db411de6c394","Type":"ContainerStarted","Data":"19989bea4f1763c8b96b070af15f73084eb61362c971daa330e6ea10ad5d0ac3"}
Apr 16 19:54:26.538118 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.538077 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" event={"ID":"2090cd61-18f9-425a-bf42-4b9628417aad","Type":"ContainerStarted","Data":"3b889fa1fb448300038d5c2afbcf328e72e98ffb03ded82e9a98ca3c4026b8e2"}
Apr 16 19:54:26.539414 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.539365 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" event={"ID":"499fb04b-1629-4d2a-8d4c-4b6f38ec093e","Type":"ContainerStarted","Data":"1e7918f1c9ea06741511c9921a3e5261663842460d0c7b9de83d22c5a9cc279a"}
Apr 16 19:54:26.540602 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.540559 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl" event={"ID":"528c7eb6-e54f-42e5-bd14-acbc205001c1","Type":"ContainerStarted","Data":"889fb71762003dad25237f95d2f6a04f6ae69dd284be123a69f2358848036db1"}
Apr 16 19:54:26.542244 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.542192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" event={"ID":"5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14","Type":"ContainerStarted","Data":"8ef78eea28dc2f398620c9f1a4e1a94839134a89a08f42b1b77a3fdf4042a018"}
Apr 16 19:54:26.543460 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.543415 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gm4s4" event={"ID":"c485156a-3973-41b9-9936-2cd58d6d6ea4","Type":"ContainerStarted","Data":"17d690bd222fe79dfcc86b40b11189365d50edd738999927d8f7250746291bad"}
Apr 16 19:54:26.544786 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.544746 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" event={"ID":"856e9e38-1147-41be-a5de-fe7ea1bec4c3","Type":"ContainerStarted","Data":"783e2b3d4292abc4dae73b2ca68fffebfc9fb4ac825537506be208d340ae0427"}
Apr 16 19:54:26.546360 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.546324 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" event={"ID":"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc","Type":"ContainerStarted","Data":"f6a0612ff95ca086c4238dd4134209250baf6e830260a4b358679d1d86c5680f"}
Apr 16 19:54:26.587025 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.587002 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9mwx9"
Apr 16 19:54:26.999707 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.999677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2"
Apr 16 19:54:26.999905 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.999751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj"
Apr 16 19:54:26.999905 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:26.999781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj"
Apr 16 19:54:26.999905 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.999826 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:26.999905 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.999845 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-656d5cd769-kngz2: secret "image-registry-tls" not found
Apr 16 19:54:26.999905 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.999893 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls podName:c37c23a9-747d-419a-8a58-907dc11ecf6d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:28.99987804 +0000 UTC m=+36.178640042 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls") pod "image-registry-656d5cd769-kngz2" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d") : secret "image-registry-tls" not found
Apr 16 19:54:26.999905 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.999898 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:54:27.000164 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.999905 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:28.999898997 +0000 UTC m=+36.178661000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : configmap references non-existent config key: service-ca.crt
Apr 16 19:54:27.000164 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:26.999956 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:28.999937547 +0000 UTC m=+36.178699552 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : secret "router-metrics-certs-default" not found
Apr 16 19:54:27.101128 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:27.101058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c"
Apr 16 19:54:27.101303 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.101239 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:54:27.101377 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.101321 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls podName:684e28e1-6369-483c-abf7-b3f82437af2c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.101300411 +0000 UTC m=+36.280062416 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-trj4c" (UID: "684e28e1-6369-483c-abf7-b3f82437af2c") : secret "samples-operator-tls" not found
Apr 16 19:54:27.201955 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:27.201924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs"
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:27.201986 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg"
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.202107 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.202128 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:27.202150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf"
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.202178 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert podName:2676cacd-954e-4b7c-a85c-1b43b90f0471 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.202158983 +0000 UTC m=+36.380921004 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert") pod "ingress-canary-ppsrs" (UID: "2676cacd-954e-4b7c-a85c-1b43b90f0471") : secret "canary-serving-cert" not found
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.202222 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.202259 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls podName:b9ff25d1-4296-4a79-9bfa-cad826fb48cb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.202236978 +0000 UTC m=+36.380998980 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rcxgg" (UID: "b9ff25d1-4296-4a79-9bfa-cad826fb48cb") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.202291 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert podName:dea42027-362c-41e6-a940-a20b985788b0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.202278607 +0000 UTC m=+36.381040614 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lcvgf" (UID: "dea42027-362c-41e6-a940-a20b985788b0") : secret "networking-console-plugin-cert" not found
Apr 16 19:54:27.202421 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:27.202377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4"
Apr 16 19:54:27.202757 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.202476 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:27.202757 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:27.202524 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls podName:40b37910-94fc-4da8-aa1d-163af810d004 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:29.202512363 +0000 UTC m=+36.381274368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls") pod "dns-default-9clf4" (UID: "40b37910-94fc-4da8-aa1d-163af810d004") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.024690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2"
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.024787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj"
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.024824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj"
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.024992 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.025057 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.025036784 +0000 UTC m=+40.203798793 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : secret "router-metrics-certs-default" not found
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.025483 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.025498 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-656d5cd769-kngz2: secret "image-registry-tls" not found
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.025750 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls podName:c37c23a9-747d-419a-8a58-907dc11ecf6d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.025731548 +0000 UTC m=+40.204493571 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls") pod "image-registry-656d5cd769-kngz2" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d") : secret "image-registry-tls" not found
Apr 16 19:54:29.025917 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.025831 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.025820291 +0000 UTC m=+40.204582309 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : configmap references non-existent config key: service-ca.crt
Apr 16 19:54:29.126695 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.125959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c"
Apr 16 19:54:29.126695 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.126256 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:54:29.126695 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.126324 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls podName:684e28e1-6369-483c-abf7-b3f82437af2c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.126305135 +0000 UTC m=+40.305067156 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-trj4c" (UID: "684e28e1-6369-483c-abf7-b3f82437af2c") : secret "samples-operator-tls" not found
Apr 16 19:54:29.170521 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.170467 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qvsgg"]
Apr 16 19:54:29.172086 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:29.172053 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc2e7638_ebb5_4713_a221_8c885ed0b19d.slice/crio-56687f7358a45a831cf3ef96b252ea2e0f52730056bb5add798fd97d7206c424 WatchSource:0}: Error finding container 56687f7358a45a831cf3ef96b252ea2e0f52730056bb5add798fd97d7206c424: Status 404 returned error can't find the container with id 56687f7358a45a831cf3ef96b252ea2e0f52730056bb5add798fd97d7206c424
Apr 16 19:54:29.194390 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.194256 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9mwx9"]
Apr 16 19:54:29.200185 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:29.200150 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfec5c910_3cf0_49a0_b436_62a7236c7d68.slice/crio-9fd385664b46abd56139303fb65b4286a51bb3474ca4a34eec73e4f40c3156ee WatchSource:0}: Error finding container 9fd385664b46abd56139303fb65b4286a51bb3474ca4a34eec73e4f40c3156ee: Status 404 returned error can't find the container with id 9fd385664b46abd56139303fb65b4286a51bb3474ca4a34eec73e4f40c3156ee
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.227471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg"
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.227548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf"
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.227627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4"
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.227705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs"
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.227840 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.227889 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert podName:2676cacd-954e-4b7c-a85c-1b43b90f0471 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.227871757 +0000 UTC m=+40.406633773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert") pod "ingress-canary-ppsrs" (UID: "2676cacd-954e-4b7c-a85c-1b43b90f0471") : secret "canary-serving-cert" not found
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.228286 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.228335 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls podName:b9ff25d1-4296-4a79-9bfa-cad826fb48cb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.228319701 +0000 UTC m=+40.407081707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rcxgg" (UID: "b9ff25d1-4296-4a79-9bfa-cad826fb48cb") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.228388 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.228421 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert podName:dea42027-362c-41e6-a940-a20b985788b0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.228410732 +0000 UTC m=+40.407172737 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lcvgf" (UID: "dea42027-362c-41e6-a940-a20b985788b0") : secret "networking-console-plugin-cert" not found
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.228472 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:54:29.228566 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:29.228499 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls podName:40b37910-94fc-4da8-aa1d-163af810d004 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:33.228489789 +0000 UTC m=+40.407251794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls") pod "dns-default-9clf4" (UID: "40b37910-94fc-4da8-aa1d-163af810d004") : secret "dns-default-metrics-tls" not found
Apr 16 19:54:29.563535 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.563387 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerStarted","Data":"a25257cc9ebad526b9ed95f8f5ebf86b4b97778738fac137b724d3a11a5a5f16"}
Apr 16 19:54:29.566562 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.566300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc" event={"ID":"72d1ed32-dd1f-4c30-adc3-db411de6c394","Type":"ContainerStarted","Data":"4b13d4fa9e6ec2d698ec5cb2a367278901e26090725873e56134bd0dcbf6622c"}
Apr 16 19:54:29.571830 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.571777 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9mwx9" event={"ID":"fec5c910-3cf0-49a0-b436-62a7236c7d68","Type":"ContainerStarted","Data":"9fd385664b46abd56139303fb65b4286a51bb3474ca4a34eec73e4f40c3156ee"}
Apr 16 19:54:29.577340 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.576343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qvsgg" event={"ID":"dc2e7638-ebb5-4713-a221-8c885ed0b19d","Type":"ContainerStarted","Data":"56687f7358a45a831cf3ef96b252ea2e0f52730056bb5add798fd97d7206c424"}
Apr 16 19:54:29.623239 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:29.622922 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6cnrc" podStartSLOduration=27.432682809 podStartE2EDuration="30.622902621s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.790364367 +0000 UTC m=+32.969126384" lastFinishedPulling="2026-04-16 19:54:28.980584179 +0000 UTC m=+36.159346196" observedRunningTime="2026-04-16 19:54:29.620948978 +0000 UTC m=+36.799711004" watchObservedRunningTime="2026-04-16 19:54:29.622902621 +0000 UTC m=+36.801664647"
Apr 16 19:54:30.610851 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:30.609530 2572 generic.go:358] "Generic (PLEG): container finished" podID="2cd1935c-d57b-4e12-881b-0c81444e85ac" containerID="a25257cc9ebad526b9ed95f8f5ebf86b4b97778738fac137b724d3a11a5a5f16" exitCode=0
Apr 16 19:54:30.610851 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:30.610596 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerDied","Data":"a25257cc9ebad526b9ed95f8f5ebf86b4b97778738fac137b724d3a11a5a5f16"}
Apr 16 19:54:31.633284 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:31.633249 2572 generic.go:358] "Generic (PLEG): container finished" podID="2cd1935c-d57b-4e12-881b-0c81444e85ac" containerID="25d83048d53d57fb30dcbac6b02a942d47da048e686642c239f9e23f916df28f" exitCode=0
Apr 16 19:54:31.633751 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:31.633306 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerDied","Data":"25d83048d53d57fb30dcbac6b02a942d47da048e686642c239f9e23f916df28f"}
Apr 16 19:54:33.071766 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:33.071721 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj"
Apr 16 19:54:33.072222 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:33.071782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj"
Apr 16 19:54:33.072222 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:33.071921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2"
Apr 16 19:54:33.072222 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.072081 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not
found Apr 16 19:54:33.072222 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.072112 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-656d5cd769-kngz2: secret "image-registry-tls" not found Apr 16 19:54:33.072222 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.072171 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls podName:c37c23a9-747d-419a-8a58-907dc11ecf6d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:41.072152371 +0000 UTC m=+48.250914377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls") pod "image-registry-656d5cd769-kngz2" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d") : secret "image-registry-tls" not found Apr 16 19:54:33.072609 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.072592 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:41.072576086 +0000 UTC m=+48.251338104 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:33.072682 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.072669 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:33.072734 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.072706 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:41.072694268 +0000 UTC m=+48.251456277 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : secret "router-metrics-certs-default" not found Apr 16 19:54:33.172718 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:33.172629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:33.172898 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.172881 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:33.172969 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.172946 2572 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls podName:684e28e1-6369-483c-abf7-b3f82437af2c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:41.172927267 +0000 UTC m=+48.351689288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-trj4c" (UID: "684e28e1-6369-483c-abf7-b3f82437af2c") : secret "samples-operator-tls" not found Apr 16 19:54:33.273764 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:33.273685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:33.273931 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:33.273792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:33.273931 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.273807 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:33.273931 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:33.273870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " 
pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:33.273931 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.273925 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:33.274401 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.273948 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert podName:dea42027-362c-41e6-a940-a20b985788b0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:41.273889276 +0000 UTC m=+48.452651280 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lcvgf" (UID: "dea42027-362c-41e6-a940-a20b985788b0") : secret "networking-console-plugin-cert" not found Apr 16 19:54:33.274401 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.273968 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls podName:40b37910-94fc-4da8-aa1d-163af810d004 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:41.273958363 +0000 UTC m=+48.452720366 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls") pod "dns-default-9clf4" (UID: "40b37910-94fc-4da8-aa1d-163af810d004") : secret "dns-default-metrics-tls" not found Apr 16 19:54:33.274401 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.274015 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:33.274401 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:33.274018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:33.274401 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.274050 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert podName:2676cacd-954e-4b7c-a85c-1b43b90f0471 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:41.27403459 +0000 UTC m=+48.452796593 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert") pod "ingress-canary-ppsrs" (UID: "2676cacd-954e-4b7c-a85c-1b43b90f0471") : secret "canary-serving-cert" not found Apr 16 19:54:33.274401 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.274124 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:33.274401 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:33.274171 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls podName:b9ff25d1-4296-4a79-9bfa-cad826fb48cb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:41.274160019 +0000 UTC m=+48.452922044 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rcxgg" (UID: "b9ff25d1-4296-4a79-9bfa-cad826fb48cb") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:41.143719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:41.143679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:41.143719 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:41.143724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " 
pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:41.144150 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:41.143799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:41.144150 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.143839 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.143822096 +0000 UTC m=+64.322584115 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:41.144150 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.143886 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:54:41.144150 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.143889 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:54:41.144150 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.143941 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs podName:142d9ab9-4b05-479d-b198-2760a09292d1 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.143928375 +0000 UTC m=+64.322690382 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : secret "router-metrics-certs-default" not found Apr 16 19:54:41.144150 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.143896 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-656d5cd769-kngz2: secret "image-registry-tls" not found Apr 16 19:54:41.144150 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.143973 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls podName:c37c23a9-747d-419a-8a58-907dc11ecf6d nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.143966822 +0000 UTC m=+64.322728825 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls") pod "image-registry-656d5cd769-kngz2" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d") : secret "image-registry-tls" not found Apr 16 19:54:41.244252 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:41.244221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:41.244657 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.244632 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:54:41.244754 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.244725 2572 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls podName:684e28e1-6369-483c-abf7-b3f82437af2c nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.244704261 +0000 UTC m=+64.423466282 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-trj4c" (UID: "684e28e1-6369-483c-abf7-b3f82437af2c") : secret "samples-operator-tls" not found Apr 16 19:54:41.344918 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:41.344887 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:41.345114 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:41.344950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:41.345114 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.345024 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:54:41.345114 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.345061 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:54:41.345114 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.345076 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert 
podName:2676cacd-954e-4b7c-a85c-1b43b90f0471 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.345062367 +0000 UTC m=+64.523824369 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert") pod "ingress-canary-ppsrs" (UID: "2676cacd-954e-4b7c-a85c-1b43b90f0471") : secret "canary-serving-cert" not found Apr 16 19:54:41.345348 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.345153 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls podName:40b37910-94fc-4da8-aa1d-163af810d004 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.345135119 +0000 UTC m=+64.523897129 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls") pod "dns-default-9clf4" (UID: "40b37910-94fc-4da8-aa1d-163af810d004") : secret "dns-default-metrics-tls" not found Apr 16 19:54:41.345348 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:41.345183 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:41.345348 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:41.345247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:41.345348 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.345317 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:41.345508 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.345366 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:41.345508 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.345370 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls podName:b9ff25d1-4296-4a79-9bfa-cad826fb48cb nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.345359181 +0000 UTC m=+64.524121185 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rcxgg" (UID: "b9ff25d1-4296-4a79-9bfa-cad826fb48cb") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:41.345508 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:41.345416 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert podName:dea42027-362c-41e6-a940-a20b985788b0 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:57.34540401 +0000 UTC m=+64.524166018 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lcvgf" (UID: "dea42027-362c-41e6-a940-a20b985788b0") : secret "networking-console-plugin-cert" not found Apr 16 19:54:44.664172 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.664061 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" event={"ID":"5ea56c9b-a13c-4e3d-a9c0-5a4d7c254e14","Type":"ContainerStarted","Data":"be99fc3afe7ebd34898133f47734d8bc859c58e3a0ec8d5d26ed41222c03c8b5"} Apr 16 19:54:44.666008 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.665984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gm4s4" event={"ID":"c485156a-3973-41b9-9936-2cd58d6d6ea4","Type":"ContainerStarted","Data":"5e807dbf4e4f915719997d8ae94049b27f84f96d60e855469438293fd679d64b"} Apr 16 19:54:44.669672 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.669629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j48bx" event={"ID":"2cd1935c-d57b-4e12-881b-0c81444e85ac","Type":"ContainerStarted","Data":"c46b380e462fb190cca2a99fa9e7f0741d954bd970a6032f10d895eb7d728a0e"} Apr 16 19:54:44.671010 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.670986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" event={"ID":"856e9e38-1147-41be-a5de-fe7ea1bec4c3","Type":"ContainerStarted","Data":"cbea64bfa0d907922a69d0d936d3b88ac8dcb33d01f38135c8e8bb8c8fda71ae"} Apr 16 19:54:44.671180 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.671167 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:44.672648 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.672624 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/0.log" Apr 16 19:54:44.672754 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.672663 2572 generic.go:358] "Generic (PLEG): container finished" podID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" containerID="438d6a4868a80b904f76da055a5dfce5bc95ff7a60189c7996aae252277d5b73" exitCode=255 Apr 16 19:54:44.672754 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.672741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" event={"ID":"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc","Type":"ContainerDied","Data":"438d6a4868a80b904f76da055a5dfce5bc95ff7a60189c7996aae252277d5b73"} Apr 16 19:54:44.673203 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.673176 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" Apr 16 19:54:44.673317 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.673296 2572 scope.go:117] "RemoveContainer" containerID="438d6a4868a80b904f76da055a5dfce5bc95ff7a60189c7996aae252277d5b73" Apr 16 19:54:44.674478 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.674458 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" event={"ID":"a94af5f5-bc76-4166-81f8-5f322b1a86ab","Type":"ContainerStarted","Data":"548fbf5b70b936ba22531d4341552e7ee92d7076f4f59f9deb79d33c6ad1a7a0"} Apr 16 19:54:44.675906 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.675889 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9mwx9" 
event={"ID":"fec5c910-3cf0-49a0-b436-62a7236c7d68","Type":"ContainerStarted","Data":"310f100bc76e0cf47578802993dba090f1044f049bcecb00149a9ae2f58ee7fb"} Apr 16 19:54:44.677661 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.677629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" event={"ID":"2090cd61-18f9-425a-bf42-4b9628417aad","Type":"ContainerStarted","Data":"5530e97b315240bcac14d72022d5ddbd8912ce72f3266cc2034f5c9eba4da8d8"} Apr 16 19:54:44.679117 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.679079 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" event={"ID":"499fb04b-1629-4d2a-8d4c-4b6f38ec093e","Type":"ContainerStarted","Data":"39b55aeab78a93c58cdce7f8f78fecc34e47191d6777d14076796b5b1cc8a17e"} Apr 16 19:54:44.680499 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.680480 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qvsgg" event={"ID":"dc2e7638-ebb5-4713-a221-8c885ed0b19d","Type":"ContainerStarted","Data":"39829e2fe53ef3661bbec5d38795d317057866bf7f5eb4aab951247d0ae76baf"} Apr 16 19:54:44.680601 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.680576 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:54:44.681894 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.681855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl" event={"ID":"528c7eb6-e54f-42e5-bd14-acbc205001c1","Type":"ContainerStarted","Data":"64ebc8f29635ab243b7b731fbdf587ba4c53122db50964d463305785674d9b33"} Apr 16 19:54:44.683737 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.683680 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7dd8db8f47-n8g5p" podStartSLOduration=30.598454636 podStartE2EDuration="48.683666556s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.944251455 +0000 UTC m=+33.123013475" lastFinishedPulling="2026-04-16 19:54:44.029463386 +0000 UTC m=+51.208225395" observedRunningTime="2026-04-16 19:54:44.682221758 +0000 UTC m=+51.860983786" watchObservedRunningTime="2026-04-16 19:54:44.683666556 +0000 UTC m=+51.862428581" Apr 16 19:54:44.706787 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.706746 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j48bx" podStartSLOduration=17.268399409 podStartE2EDuration="51.706735215s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:53:54.547066556 +0000 UTC m=+1.725828560" lastFinishedPulling="2026-04-16 19:54:28.985402356 +0000 UTC m=+36.164164366" observedRunningTime="2026-04-16 19:54:44.704873547 +0000 UTC m=+51.883635572" watchObservedRunningTime="2026-04-16 19:54:44.706735215 +0000 UTC m=+51.885497241" Apr 16 19:54:44.721908 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.721864 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" podStartSLOduration=27.499162214 podStartE2EDuration="45.721850603s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.805998902 +0000 UTC m=+32.984760905" lastFinishedPulling="2026-04-16 19:54:44.02868729 +0000 UTC m=+51.207449294" observedRunningTime="2026-04-16 19:54:44.720514129 +0000 UTC m=+51.899276155" watchObservedRunningTime="2026-04-16 19:54:44.721850603 +0000 UTC m=+51.900612629" Apr 16 19:54:44.739704 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.739656 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-649f77c955-hl7lw" podStartSLOduration=30.659100246 podStartE2EDuration="48.739641795s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.94848719 +0000 UTC m=+33.127249195" lastFinishedPulling="2026-04-16 19:54:44.029028735 +0000 UTC m=+51.207790744" observedRunningTime="2026-04-16 19:54:44.738386621 +0000 UTC m=+51.917148647" watchObservedRunningTime="2026-04-16 19:54:44.739641795 +0000 UTC m=+51.918403821" Apr 16 19:54:44.756077 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.756033 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-c8jsl" podStartSLOduration=21.704024249 podStartE2EDuration="39.756018685s" podCreationTimestamp="2026-04-16 19:54:05 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.97831629 +0000 UTC m=+33.157078293" lastFinishedPulling="2026-04-16 19:54:44.030310721 +0000 UTC m=+51.209072729" observedRunningTime="2026-04-16 19:54:44.755545982 +0000 UTC m=+51.934308011" watchObservedRunningTime="2026-04-16 19:54:44.756018685 +0000 UTC m=+51.934780712" Apr 16 19:54:44.825256 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.825141 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-gm4s4" podStartSLOduration=27.807445742 podStartE2EDuration="45.825120619s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.846277306 +0000 UTC m=+33.025039308" lastFinishedPulling="2026-04-16 19:54:43.863952164 +0000 UTC m=+51.042714185" observedRunningTime="2026-04-16 19:54:44.801009723 +0000 UTC m=+51.979771748" watchObservedRunningTime="2026-04-16 19:54:44.825120619 +0000 UTC m=+52.003882644" Apr 16 19:54:44.853920 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.853864 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9mwx9" podStartSLOduration=36.026051417 podStartE2EDuration="50.853844735s" podCreationTimestamp="2026-04-16 19:53:54 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.202315832 +0000 UTC m=+36.381077840" lastFinishedPulling="2026-04-16 19:54:44.030109153 +0000 UTC m=+51.208871158" observedRunningTime="2026-04-16 19:54:44.826202608 +0000 UTC m=+52.004964634" watchObservedRunningTime="2026-04-16 19:54:44.853844735 +0000 UTC m=+52.032606762" Apr 16 19:54:44.854812 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.854772 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qvsgg" podStartSLOduration=36.986866247 podStartE2EDuration="51.854759838s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:54:29.175810714 +0000 UTC m=+36.354572719" lastFinishedPulling="2026-04-16 19:54:44.043704302 +0000 UTC m=+51.222466310" observedRunningTime="2026-04-16 19:54:44.852327083 +0000 UTC m=+52.031089118" watchObservedRunningTime="2026-04-16 19:54:44.854759838 +0000 UTC m=+52.033521863" Apr 16 19:54:44.879842 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:44.879798 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" podStartSLOduration=27.76643139 podStartE2EDuration="45.879782131s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.915160473 +0000 UTC m=+33.093922492" lastFinishedPulling="2026-04-16 19:54:44.02851122 +0000 UTC m=+51.207273233" observedRunningTime="2026-04-16 19:54:44.878748458 +0000 UTC m=+52.057510481" watchObservedRunningTime="2026-04-16 19:54:44.879782131 +0000 UTC m=+52.058544159" Apr 16 19:54:45.654795 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:45.654731 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:45.654795 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:45.654774 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:45.693987 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:45.693960 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/1.log" Apr 16 19:54:45.694439 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:45.694400 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/0.log" Apr 16 19:54:45.694506 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:45.694443 2572 generic.go:358] "Generic (PLEG): container finished" podID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" containerID="45992912529de7daf651cfb5e4a87f799f4ec62dd5f081418a479228db49f669" exitCode=255 Apr 16 19:54:45.694577 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:45.694553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" event={"ID":"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc","Type":"ContainerDied","Data":"45992912529de7daf651cfb5e4a87f799f4ec62dd5f081418a479228db49f669"} Apr 16 19:54:45.694632 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:45.694597 2572 scope.go:117] "RemoveContainer" containerID="438d6a4868a80b904f76da055a5dfce5bc95ff7a60189c7996aae252277d5b73" Apr 16 19:54:45.694847 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:45.694829 2572 scope.go:117] "RemoveContainer" containerID="45992912529de7daf651cfb5e4a87f799f4ec62dd5f081418a479228db49f669" Apr 16 19:54:45.695067 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:45.695048 2572 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2xttw_openshift-console-operator(d7cbe800-699f-48fe-9a8a-b74c32bf0dcc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" podUID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" Apr 16 19:54:46.698023 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:46.697993 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/1.log" Apr 16 19:54:46.698402 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:46.698386 2572 scope.go:117] "RemoveContainer" containerID="45992912529de7daf651cfb5e4a87f799f4ec62dd5f081418a479228db49f669" Apr 16 19:54:46.698564 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:46.698548 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2xttw_openshift-console-operator(d7cbe800-699f-48fe-9a8a-b74c32bf0dcc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" podUID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" Apr 16 19:54:48.113142 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:48.113116 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t4xmk_e34d0c6b-2806-42a6-9665-4b769fb05f24/dns-node-resolver/0.log" Apr 16 19:54:48.705704 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:48.705671 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" event={"ID":"a94af5f5-bc76-4166-81f8-5f322b1a86ab","Type":"ContainerStarted","Data":"fa90e674137d116c9ab9e146edb1115f9f9e54d3250ce8ddaa1a2db50ac49b63"} Apr 16 19:54:48.705869 ip-10-0-128-48 
kubenswrapper[2572]: I0416 19:54:48.705715 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" event={"ID":"a94af5f5-bc76-4166-81f8-5f322b1a86ab","Type":"ContainerStarted","Data":"b7d4b22ccc5f51d81d63d34c001b5e12de003d2c2ea123b786e9a27d993d3f35"} Apr 16 19:54:48.713310 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:48.713291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2jzjc_b9261443-4578-43ee-abc0-7931d8ab9f10/node-ca/0.log" Apr 16 19:54:48.728028 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:48.727981 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" podStartSLOduration=30.977238089 podStartE2EDuration="52.727967629s" podCreationTimestamp="2026-04-16 19:53:56 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.985747675 +0000 UTC m=+33.164509680" lastFinishedPulling="2026-04-16 19:54:47.736477216 +0000 UTC m=+54.915239220" observedRunningTime="2026-04-16 19:54:48.725808162 +0000 UTC m=+55.904570187" watchObservedRunningTime="2026-04-16 19:54:48.727967629 +0000 UTC m=+55.906729653" Apr 16 19:54:50.537249 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:50.537217 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bsbmz" Apr 16 19:54:55.653790 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.653758 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:55.653790 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.653797 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:55.654209 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.654183 2572 
scope.go:117] "RemoveContainer" containerID="45992912529de7daf651cfb5e4a87f799f4ec62dd5f081418a479228db49f669" Apr 16 19:54:55.729072 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.729047 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/1.log" Apr 16 19:54:55.729209 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.729137 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" event={"ID":"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc","Type":"ContainerStarted","Data":"619193c81ee6e343be0256b7ab2e34ac2e84ee49f06f0fb5a374ee468cdd9a7f"} Apr 16 19:54:55.729470 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.729455 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:54:55.730437 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.730415 2572 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-2xttw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.10:8443/readyz\": dial tcp 10.132.0.10:8443: connect: connection refused" start-of-body= Apr 16 19:54:55.730517 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.730455 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" podUID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.10:8443/readyz\": dial tcp 10.132.0.10:8443: connect: connection refused" Apr 16 19:54:55.754677 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:55.754637 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" podStartSLOduration=38.601228385 
podStartE2EDuration="56.754622173s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:25.875722967 +0000 UTC m=+33.054484988" lastFinishedPulling="2026-04-16 19:54:44.029116773 +0000 UTC m=+51.207878776" observedRunningTime="2026-04-16 19:54:55.753297751 +0000 UTC m=+62.932059777" watchObservedRunningTime="2026-04-16 19:54:55.754622173 +0000 UTC m=+62.933384200" Apr 16 19:54:56.732374 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:56.732345 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 19:54:56.732758 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:56.732710 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/1.log" Apr 16 19:54:56.732758 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:56.732742 2572 generic.go:358] "Generic (PLEG): container finished" podID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" containerID="619193c81ee6e343be0256b7ab2e34ac2e84ee49f06f0fb5a374ee468cdd9a7f" exitCode=255 Apr 16 19:54:56.732835 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:56.732812 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" event={"ID":"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc","Type":"ContainerDied","Data":"619193c81ee6e343be0256b7ab2e34ac2e84ee49f06f0fb5a374ee468cdd9a7f"} Apr 16 19:54:56.732868 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:56.732842 2572 scope.go:117] "RemoveContainer" containerID="45992912529de7daf651cfb5e4a87f799f4ec62dd5f081418a479228db49f669" Apr 16 19:54:56.733007 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:56.732990 2572 scope.go:117] "RemoveContainer" containerID="619193c81ee6e343be0256b7ab2e34ac2e84ee49f06f0fb5a374ee468cdd9a7f" Apr 16 19:54:56.733234 ip-10-0-128-48 
kubenswrapper[2572]: E0416 19:54:56.733211 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2xttw_openshift-console-operator(d7cbe800-699f-48fe-9a8a-b74c32bf0dcc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" podUID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" Apr 16 19:54:57.179908 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.179877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:57.180068 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.179915 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:57.180068 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.180029 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:57.180170 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:57.180122 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle podName:142d9ab9-4b05-479d-b198-2760a09292d1 
nodeName:}" failed. No retries permitted until 2026-04-16 19:55:29.180084203 +0000 UTC m=+96.358846224 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle") pod "router-default-74fc646496-l58qj" (UID: "142d9ab9-4b05-479d-b198-2760a09292d1") : configmap references non-existent config key: service-ca.crt Apr 16 19:54:57.183310 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.183283 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"image-registry-656d5cd769-kngz2\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") " pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:57.183411 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.183337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/142d9ab9-4b05-479d-b198-2760a09292d1-metrics-certs\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:54:57.280983 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.280958 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:57.282973 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.282954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/684e28e1-6369-483c-abf7-b3f82437af2c-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-trj4c\" (UID: \"684e28e1-6369-483c-abf7-b3f82437af2c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:57.382260 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.382238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:57.382387 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.382290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:57.382387 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.382326 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:54:57.382484 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:57.382423 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:57.382484 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:57.382479 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls 
podName:b9ff25d1-4296-4a79-9bfa-cad826fb48cb nodeName:}" failed. No retries permitted until 2026-04-16 19:55:29.382461399 +0000 UTC m=+96.561223415 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rcxgg" (UID: "b9ff25d1-4296-4a79-9bfa-cad826fb48cb") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:54:57.382595 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.382479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:54:57.382652 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:57.382636 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:54:57.382715 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:57.382699 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert podName:dea42027-362c-41e6-a940-a20b985788b0 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:29.382675804 +0000 UTC m=+96.561437810 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-lcvgf" (UID: "dea42027-362c-41e6-a940-a20b985788b0") : secret "networking-console-plugin-cert" not found Apr 16 19:54:57.384458 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.384439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/40b37910-94fc-4da8-aa1d-163af810d004-metrics-tls\") pod \"dns-default-9clf4\" (UID: \"40b37910-94fc-4da8-aa1d-163af810d004\") " pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:57.384605 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.384587 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2676cacd-954e-4b7c-a85c-1b43b90f0471-cert\") pod \"ingress-canary-ppsrs\" (UID: \"2676cacd-954e-4b7c-a85c-1b43b90f0471\") " pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:57.409238 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.409218 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nfjh6\"" Apr 16 19:54:57.416982 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.416969 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:57.488112 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.478851 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jvpqq\"" Apr 16 19:54:57.490435 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.490399 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" Apr 16 19:54:57.564299 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.564270 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-656d5cd769-kngz2"] Apr 16 19:54:57.567454 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:57.567419 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc37c23a9_747d_419a_8a58_907dc11ecf6d.slice/crio-13095080d894c6b03269a7d9fad0936e3b996dec76517527a6fb156608f76a36 WatchSource:0}: Error finding container 13095080d894c6b03269a7d9fad0936e3b996dec76517527a6fb156608f76a36: Status 404 returned error can't find the container with id 13095080d894c6b03269a7d9fad0936e3b996dec76517527a6fb156608f76a36 Apr 16 19:54:57.612185 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.612159 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lx244\"" Apr 16 19:54:57.616872 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.616846 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6lbj8\"" Apr 16 19:54:57.617649 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.617628 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c"] Apr 16 19:54:57.620330 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.620304 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9clf4" Apr 16 19:54:57.624566 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.624546 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ppsrs" Apr 16 19:54:57.737648 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.737613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" event={"ID":"684e28e1-6369-483c-abf7-b3f82437af2c","Type":"ContainerStarted","Data":"5a56209a76c22f06df39164a4fcc549b5fb6942f58714d2fecbc59a30b45f2bd"} Apr 16 19:54:57.739394 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.739359 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" event={"ID":"c37c23a9-747d-419a-8a58-907dc11ecf6d","Type":"ContainerStarted","Data":"165fa232cb75e58b4443aa764cbea418f0fc71367b962d4878beddc569895af9"} Apr 16 19:54:57.739519 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.739400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" event={"ID":"c37c23a9-747d-419a-8a58-907dc11ecf6d","Type":"ContainerStarted","Data":"13095080d894c6b03269a7d9fad0936e3b996dec76517527a6fb156608f76a36"} Apr 16 19:54:57.739633 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.739613 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:54:57.740940 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.740922 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 19:54:57.741300 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.741283 2572 scope.go:117] "RemoveContainer" containerID="619193c81ee6e343be0256b7ab2e34ac2e84ee49f06f0fb5a374ee468cdd9a7f" Apr 16 19:54:57.741456 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:57.741440 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2xttw_openshift-console-operator(d7cbe800-699f-48fe-9a8a-b74c32bf0dcc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" podUID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" Apr 16 19:54:57.757749 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.757718 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9clf4"] Apr 16 19:54:57.759651 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:57.759627 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b37910_94fc_4da8_aa1d_163af810d004.slice/crio-2f73a12c1078a99ab6f372c30a41731eeb83911d0c9c33525659debedfaadaa9 WatchSource:0}: Error finding container 2f73a12c1078a99ab6f372c30a41731eeb83911d0c9c33525659debedfaadaa9: Status 404 returned error can't find the container with id 2f73a12c1078a99ab6f372c30a41731eeb83911d0c9c33525659debedfaadaa9 Apr 16 19:54:57.762941 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.762900 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" podStartSLOduration=64.762887397 podStartE2EDuration="1m4.762887397s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:57.761213694 +0000 UTC m=+64.939975718" watchObservedRunningTime="2026-04-16 19:54:57.762887397 +0000 UTC m=+64.941649421" Apr 16 19:54:57.775845 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.775825 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ppsrs"] Apr 16 19:54:57.779457 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:54:57.779433 2572 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2676cacd_954e_4b7c_a85c_1b43b90f0471.slice/crio-fada442bbecc59fa66b65a5daf14e709ff270ae4be9845b4ee99fd2639e34268 WatchSource:0}: Error finding container fada442bbecc59fa66b65a5daf14e709ff270ae4be9845b4ee99fd2639e34268: Status 404 returned error can't find the container with id fada442bbecc59fa66b65a5daf14e709ff270ae4be9845b4ee99fd2639e34268 Apr 16 19:54:57.990721 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:57.990653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478" Apr 16 19:54:57.990835 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:57.990780 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 19:54:57.990876 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:54:57.990843 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs podName:458f83e2-e97a-457a-9081-a5ae099b6973 nodeName:}" failed. No retries permitted until 2026-04-16 19:56:01.990825604 +0000 UTC m=+129.169587608 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs") pod "network-metrics-daemon-5j478" (UID: "458f83e2-e97a-457a-9081-a5ae099b6973") : secret "metrics-daemon-secret" not found Apr 16 19:54:58.746446 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:58.746389 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ppsrs" event={"ID":"2676cacd-954e-4b7c-a85c-1b43b90f0471","Type":"ContainerStarted","Data":"fada442bbecc59fa66b65a5daf14e709ff270ae4be9845b4ee99fd2639e34268"} Apr 16 19:54:58.747527 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:54:58.747498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9clf4" event={"ID":"40b37910-94fc-4da8-aa1d-163af810d004","Type":"ContainerStarted","Data":"2f73a12c1078a99ab6f372c30a41731eeb83911d0c9c33525659debedfaadaa9"} Apr 16 19:55:01.758064 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.758024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ppsrs" event={"ID":"2676cacd-954e-4b7c-a85c-1b43b90f0471","Type":"ContainerStarted","Data":"c8c1e842bc28d6431612b7d46dcfb5b5fae2687f3dab43dc8c7041e5da4c7ea8"} Apr 16 19:55:01.761022 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.760994 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" event={"ID":"684e28e1-6369-483c-abf7-b3f82437af2c","Type":"ContainerStarted","Data":"9bd69ed1bf833da755318562c550eb9755d1e59d718377e6b11f0c6ecfe0b809"} Apr 16 19:55:01.761157 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.761028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" 
event={"ID":"684e28e1-6369-483c-abf7-b3f82437af2c","Type":"ContainerStarted","Data":"c9609d2bf10d77a83b50049c66d968059832097d74213db84bda5dda1e54f5e0"} Apr 16 19:55:01.762807 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.762785 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9clf4" event={"ID":"40b37910-94fc-4da8-aa1d-163af810d004","Type":"ContainerStarted","Data":"629d4c1204b66a1fc8daa81e52eedf7ea2a576191487c75f58633381a66b5f5b"} Apr 16 19:55:01.762807 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.762810 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9clf4" event={"ID":"40b37910-94fc-4da8-aa1d-163af810d004","Type":"ContainerStarted","Data":"19aa12ebc19dd8010dad40272e02e66b92ca6f9195871cc652e692cf327275a3"} Apr 16 19:55:01.762972 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.762900 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9clf4" Apr 16 19:55:01.779442 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.779403 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ppsrs" podStartSLOduration=33.734469321 podStartE2EDuration="36.77939073s" podCreationTimestamp="2026-04-16 19:54:25 +0000 UTC" firstStartedPulling="2026-04-16 19:54:57.781567797 +0000 UTC m=+64.960329800" lastFinishedPulling="2026-04-16 19:55:00.826489203 +0000 UTC m=+68.005251209" observedRunningTime="2026-04-16 19:55:01.77834527 +0000 UTC m=+68.957107296" watchObservedRunningTime="2026-04-16 19:55:01.77939073 +0000 UTC m=+68.958152755" Apr 16 19:55:01.797396 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.797354 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-trj4c" podStartSLOduration=59.622721274 podStartE2EDuration="1m2.797343275s" podCreationTimestamp="2026-04-16 19:53:59 +0000 
UTC" firstStartedPulling="2026-04-16 19:54:57.645243473 +0000 UTC m=+64.824005496" lastFinishedPulling="2026-04-16 19:55:00.819865495 +0000 UTC m=+67.998627497" observedRunningTime="2026-04-16 19:55:01.796692864 +0000 UTC m=+68.975454902" watchObservedRunningTime="2026-04-16 19:55:01.797343275 +0000 UTC m=+68.976105299" Apr 16 19:55:01.817247 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:01.817205 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9clf4" podStartSLOduration=33.761429157 podStartE2EDuration="36.817193262s" podCreationTimestamp="2026-04-16 19:54:25 +0000 UTC" firstStartedPulling="2026-04-16 19:54:57.764505011 +0000 UTC m=+64.943267014" lastFinishedPulling="2026-04-16 19:55:00.820269113 +0000 UTC m=+67.999031119" observedRunningTime="2026-04-16 19:55:01.816832439 +0000 UTC m=+68.995594465" watchObservedRunningTime="2026-04-16 19:55:01.817193262 +0000 UTC m=+68.995955339" Apr 16 19:55:05.654400 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:05.654368 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:55:05.654825 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:05.654703 2572 scope.go:117] "RemoveContainer" containerID="619193c81ee6e343be0256b7ab2e34ac2e84ee49f06f0fb5a374ee468cdd9a7f" Apr 16 19:55:05.654872 ip-10-0-128-48 kubenswrapper[2572]: E0416 19:55:05.654855 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2xttw_openshift-console-operator(d7cbe800-699f-48fe-9a8a-b74c32bf0dcc)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" podUID="d7cbe800-699f-48fe-9a8a-b74c32bf0dcc" Apr 16 19:55:08.473391 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.473355 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-insights/insights-runtime-extractor-cl59r"] Apr 16 19:55:08.521346 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.521316 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cl59r"] Apr 16 19:55:08.521498 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.521450 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.524777 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.524753 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:55:08.525690 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.525673 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:55:08.525792 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.525714 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vh9v5\"" Apr 16 19:55:08.687187 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.687151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.687348 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.687246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54x7d\" (UniqueName: \"kubernetes.io/projected/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-kube-api-access-54x7d\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " 
pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.687348 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.687293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.687417 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.687353 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-crio-socket\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.687417 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.687382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-data-volume\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.788480 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.788402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54x7d\" (UniqueName: \"kubernetes.io/projected/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-kube-api-access-54x7d\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.788480 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.788452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.788697 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.788583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-crio-socket\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.788697 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.788622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-data-volume\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.788697 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.788667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.788832 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.788692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-crio-socket\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.789049 
ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.789028 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-data-volume\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.789280 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.789258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.790702 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.790677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.797434 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.797415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54x7d\" (UniqueName: \"kubernetes.io/projected/9b0ca5b9-11b5-46e9-80c6-d8a389e78051-kube-api-access-54x7d\") pod \"insights-runtime-extractor-cl59r\" (UID: \"9b0ca5b9-11b5-46e9-80c6-d8a389e78051\") " pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.830456 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.830437 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cl59r" Apr 16 19:55:08.958516 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:08.958487 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cl59r"] Apr 16 19:55:08.961001 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:55:08.960972 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0ca5b9_11b5_46e9_80c6_d8a389e78051.slice/crio-2c4930e039d83fe4c13301dabc52528466260aba281e11031340ecb3bb492090 WatchSource:0}: Error finding container 2c4930e039d83fe4c13301dabc52528466260aba281e11031340ecb3bb492090: Status 404 returned error can't find the container with id 2c4930e039d83fe4c13301dabc52528466260aba281e11031340ecb3bb492090 Apr 16 19:55:09.785773 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:09.785740 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cl59r" event={"ID":"9b0ca5b9-11b5-46e9-80c6-d8a389e78051","Type":"ContainerStarted","Data":"7add8b583eaec33a79abd732b39b387c444944795ed3009033b4e66cb27a2788"} Apr 16 19:55:09.785773 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:09.785778 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cl59r" event={"ID":"9b0ca5b9-11b5-46e9-80c6-d8a389e78051","Type":"ContainerStarted","Data":"2c4930e039d83fe4c13301dabc52528466260aba281e11031340ecb3bb492090"} Apr 16 19:55:10.790236 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:10.790197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cl59r" event={"ID":"9b0ca5b9-11b5-46e9-80c6-d8a389e78051","Type":"ContainerStarted","Data":"59a50a2f5a3bcf04eae399e7ed7e14fa695e739fb80d6ee6aa79d40ec1870b93"} Apr 16 19:55:11.767709 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:11.767679 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-dns/dns-default-9clf4" Apr 16 19:55:12.797951 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:12.797919 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cl59r" event={"ID":"9b0ca5b9-11b5-46e9-80c6-d8a389e78051","Type":"ContainerStarted","Data":"da312835bf11bdedc2afc0fe25da6ae2a2f1237c770784c293f5c42b72e4f4d9"} Apr 16 19:55:12.823261 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:12.823208 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cl59r" podStartSLOduration=1.949709115 podStartE2EDuration="4.823195179s" podCreationTimestamp="2026-04-16 19:55:08 +0000 UTC" firstStartedPulling="2026-04-16 19:55:09.057077014 +0000 UTC m=+76.235839017" lastFinishedPulling="2026-04-16 19:55:11.930563079 +0000 UTC m=+79.109325081" observedRunningTime="2026-04-16 19:55:12.822701426 +0000 UTC m=+80.001463450" watchObservedRunningTime="2026-04-16 19:55:12.823195179 +0000 UTC m=+80.001957204" Apr 16 19:55:15.698772 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:15.698738 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qvsgg" Apr 16 19:55:16.362054 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:16.362023 2572 scope.go:117] "RemoveContainer" containerID="619193c81ee6e343be0256b7ab2e34ac2e84ee49f06f0fb5a374ee468cdd9a7f" Apr 16 19:55:16.809434 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:16.809411 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 19:55:16.809780 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:16.809465 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" 
event={"ID":"d7cbe800-699f-48fe-9a8a-b74c32bf0dcc","Type":"ContainerStarted","Data":"68f30b01cd2b8d724c35a8bc29045c317de34ab338430173565a9d21cbbe4c97"} Apr 16 19:55:16.809780 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:16.809729 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:55:16.989520 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:16.989487 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2xttw" Apr 16 19:55:17.421049 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:17.421013 2572 patch_prober.go:28] interesting pod/image-registry-656d5cd769-kngz2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 19:55:17.421251 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:17.421071 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" podUID="c37c23a9-747d-419a-8a58-907dc11ecf6d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:55:18.752210 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:18.752183 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" Apr 16 19:55:29.264788 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.264749 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " 
pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:55:29.265490 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.265467 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/142d9ab9-4b05-479d-b198-2760a09292d1-service-ca-bundle\") pod \"router-default-74fc646496-l58qj\" (UID: \"142d9ab9-4b05-479d-b198-2760a09292d1\") " pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:55:29.466843 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.466746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:55:29.467004 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.466871 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:55:29.469252 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.469224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dea42027-362c-41e6-a940-a20b985788b0-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lcvgf\" (UID: \"dea42027-362c-41e6-a940-a20b985788b0\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:55:29.469360 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.469250 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9ff25d1-4296-4a79-9bfa-cad826fb48cb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rcxgg\" (UID: \"b9ff25d1-4296-4a79-9bfa-cad826fb48cb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:55:29.516697 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.516666 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-q92g7\"" Apr 16 19:55:29.524521 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.524499 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:55:29.637398 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.637374 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-84znb\"" Apr 16 19:55:29.645661 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.645561 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" Apr 16 19:55:29.648078 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.648056 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-74fc646496-l58qj"] Apr 16 19:55:29.649176 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:55:29.649152 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod142d9ab9_4b05_479d_b198_2760a09292d1.slice/crio-5886f61e4839d4fb570f1bc1dfa8692085dba11126bb359b3a0aedc195c12a06 WatchSource:0}: Error finding container 5886f61e4839d4fb570f1bc1dfa8692085dba11126bb359b3a0aedc195c12a06: Status 404 returned error can't find the container with id 5886f61e4839d4fb570f1bc1dfa8692085dba11126bb359b3a0aedc195c12a06 Apr 16 19:55:29.651622 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.651601 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-w9jfp\"" Apr 16 19:55:29.659232 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.659208 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" Apr 16 19:55:29.789504 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.789458 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg"] Apr 16 19:55:29.794014 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:55:29.793988 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ff25d1_4296_4a79_9bfa_cad826fb48cb.slice/crio-414e10aca0f220bbcf7a05c6de5530ac485874956a606ee227ca9f24d38d6a0e WatchSource:0}: Error finding container 414e10aca0f220bbcf7a05c6de5530ac485874956a606ee227ca9f24d38d6a0e: Status 404 returned error can't find the container with id 414e10aca0f220bbcf7a05c6de5530ac485874956a606ee227ca9f24d38d6a0e Apr 16 19:55:29.810217 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.810197 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf"] Apr 16 19:55:29.812526 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:55:29.812502 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea42027_362c_41e6_a940_a20b985788b0.slice/crio-2e18795b3d8b8b56a5a993f3bde601849a5ca4360974400fa0fb8e7090a9cf37 WatchSource:0}: Error finding container 2e18795b3d8b8b56a5a993f3bde601849a5ca4360974400fa0fb8e7090a9cf37: Status 404 returned error can't find the container with id 2e18795b3d8b8b56a5a993f3bde601849a5ca4360974400fa0fb8e7090a9cf37 Apr 16 19:55:29.844955 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.844911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" event={"ID":"b9ff25d1-4296-4a79-9bfa-cad826fb48cb","Type":"ContainerStarted","Data":"414e10aca0f220bbcf7a05c6de5530ac485874956a606ee227ca9f24d38d6a0e"} Apr 16 
19:55:29.846240 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.846213 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-74fc646496-l58qj" event={"ID":"142d9ab9-4b05-479d-b198-2760a09292d1","Type":"ContainerStarted","Data":"e20cab7259ed21d79a2ffc33ca5ed5d97d22d0218f0fda1c4e839c8bce7a387e"} Apr 16 19:55:29.846337 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.846245 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-74fc646496-l58qj" event={"ID":"142d9ab9-4b05-479d-b198-2760a09292d1","Type":"ContainerStarted","Data":"5886f61e4839d4fb570f1bc1dfa8692085dba11126bb359b3a0aedc195c12a06"} Apr 16 19:55:29.847252 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.847231 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" event={"ID":"dea42027-362c-41e6-a940-a20b985788b0","Type":"ContainerStarted","Data":"2e18795b3d8b8b56a5a993f3bde601849a5ca4360974400fa0fb8e7090a9cf37"} Apr 16 19:55:29.877395 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:29.877322 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-74fc646496-l58qj" podStartSLOduration=90.877307262 podStartE2EDuration="1m30.877307262s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:55:29.875552333 +0000 UTC m=+97.054314359" watchObservedRunningTime="2026-04-16 19:55:29.877307262 +0000 UTC m=+97.056069298" Apr 16 19:55:30.525460 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:30.525429 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:55:30.528777 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:30.528748 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:55:30.724299 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:30.724193 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-656d5cd769-kngz2"] Apr 16 19:55:30.852723 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:30.852688 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" event={"ID":"dea42027-362c-41e6-a940-a20b985788b0","Type":"ContainerStarted","Data":"766ee1d14257f0e1258961c8687d4a9e1d2076d1954334c3d231d72141e1a385"} Apr 16 19:55:30.853250 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:30.853229 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:55:30.854498 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:30.854476 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-74fc646496-l58qj" Apr 16 19:55:30.877879 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:30.877824 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lcvgf" podStartSLOduration=79.918544819 podStartE2EDuration="1m20.877809648s" podCreationTimestamp="2026-04-16 19:54:10 +0000 UTC" firstStartedPulling="2026-04-16 19:55:29.814253652 +0000 UTC m=+96.993015655" lastFinishedPulling="2026-04-16 19:55:30.773518481 +0000 UTC m=+97.952280484" observedRunningTime="2026-04-16 19:55:30.877015678 +0000 UTC m=+98.055777705" watchObservedRunningTime="2026-04-16 19:55:30.877809648 +0000 UTC m=+98.056571672" Apr 16 19:55:31.859284 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:31.859242 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" 
event={"ID":"b9ff25d1-4296-4a79-9bfa-cad826fb48cb","Type":"ContainerStarted","Data":"7661d87556c4d59251146680a38109d0d339d66b8fd3e235f2c6c62512bf4bcb"} Apr 16 19:55:31.876884 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:31.876794 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rcxgg" podStartSLOduration=91.322642933 podStartE2EDuration="1m32.876778466s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:55:29.79602273 +0000 UTC m=+96.974784734" lastFinishedPulling="2026-04-16 19:55:31.350158247 +0000 UTC m=+98.528920267" observedRunningTime="2026-04-16 19:55:31.875796377 +0000 UTC m=+99.054558402" watchObservedRunningTime="2026-04-16 19:55:31.876778466 +0000 UTC m=+99.055540685" Apr 16 19:55:43.289410 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.289377 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ghm4g"] Apr 16 19:55:43.291957 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.291928 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ghm4g" Apr 16 19:55:43.294603 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.294577 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:55:43.294735 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.294690 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:55:43.294963 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.294932 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:55:43.295990 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.295971 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:55:43.296083 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.295996 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wxw92\"" Apr 16 19:55:43.378868 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.378842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5lls\" (UniqueName: \"kubernetes.io/projected/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-kube-api-access-t5lls\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g" Apr 16 19:55:43.379024 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.378880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-sys\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g" 
Apr 16 19:55:43.379024 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.378897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-textfile\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.379024 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.378917 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.379024 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.378941 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-tls\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.379024 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.379007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-wtmp\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.379294 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.379045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-root\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.379294 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.379085 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-accelerators-collector-config\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.379294 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.379151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-metrics-client-ca\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480221 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-wtmp\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480346 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480225 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-root\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480346 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-accelerators-collector-config\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480346 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-metrics-client-ca\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480346 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-root\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480346 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480323 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5lls\" (UniqueName: \"kubernetes.io/projected/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-kube-api-access-t5lls\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480595 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480348 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-wtmp\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480595 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-sys\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480595 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-textfile\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480595 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480595 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-tls\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480595 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-sys\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480949 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-textfile\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480949 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-accelerators-collector-config\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.480949 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.480934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-metrics-client-ca\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.482788 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.482770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-tls\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.482850 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.482806 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.488206 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.488189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5lls\" (UniqueName: \"kubernetes.io/projected/3b43c525-2b23-4b6e-b52f-b7b02c633eb0-kube-api-access-t5lls\") pod \"node-exporter-ghm4g\" (UID: \"3b43c525-2b23-4b6e-b52f-b7b02c633eb0\") " pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.602178 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.602148 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ghm4g"
Apr 16 19:55:43.610694 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:55:43.610670 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b43c525_2b23_4b6e_b52f_b7b02c633eb0.slice/crio-4321026a14fe676ff21aaed27722566b26035ab52257f61552066aaf175d2c72 WatchSource:0}: Error finding container 4321026a14fe676ff21aaed27722566b26035ab52257f61552066aaf175d2c72: Status 404 returned error can't find the container with id 4321026a14fe676ff21aaed27722566b26035ab52257f61552066aaf175d2c72
Apr 16 19:55:43.895765 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:43.895688 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ghm4g" event={"ID":"3b43c525-2b23-4b6e-b52f-b7b02c633eb0","Type":"ContainerStarted","Data":"4321026a14fe676ff21aaed27722566b26035ab52257f61552066aaf175d2c72"}
Apr 16 19:55:44.902444 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:44.902407 2572 generic.go:358] "Generic (PLEG): container finished" podID="3b43c525-2b23-4b6e-b52f-b7b02c633eb0" containerID="c2db0a39b865eebf186ceb203819befdf83d9d83b281a7b77db0c152db960149" exitCode=0
Apr 16 19:55:44.902893 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:44.902490 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ghm4g" event={"ID":"3b43c525-2b23-4b6e-b52f-b7b02c633eb0","Type":"ContainerDied","Data":"c2db0a39b865eebf186ceb203819befdf83d9d83b281a7b77db0c152db960149"}
Apr 16 19:55:45.907206 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:45.907167 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ghm4g" event={"ID":"3b43c525-2b23-4b6e-b52f-b7b02c633eb0","Type":"ContainerStarted","Data":"8091202c39f1dbd82df73ae3e9d1919e683e7520158c495f8fd4c16b17d51f85"}
Apr 16 19:55:45.907206 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:45.907203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ghm4g" event={"ID":"3b43c525-2b23-4b6e-b52f-b7b02c633eb0","Type":"ContainerStarted","Data":"00441b3daef54ade75b8df2609ae2ed1c758cd4731b754764c5d13c412f731a4"}
Apr 16 19:55:45.928916 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:45.928869 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ghm4g" podStartSLOduration=2.107851341 podStartE2EDuration="2.928856001s" podCreationTimestamp="2026-04-16 19:55:43 +0000 UTC" firstStartedPulling="2026-04-16 19:55:43.612156805 +0000 UTC m=+110.790918814" lastFinishedPulling="2026-04-16 19:55:44.433161467 +0000 UTC m=+111.611923474" observedRunningTime="2026-04-16 19:55:45.926695518 +0000 UTC m=+113.105457542" watchObservedRunningTime="2026-04-16 19:55:45.928856001 +0000 UTC m=+113.107618026"
Apr 16 19:55:55.749223 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:55.749167 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" podUID="c37c23a9-747d-419a-8a58-907dc11ecf6d" containerName="registry" containerID="cri-o://165fa232cb75e58b4443aa764cbea418f0fc71367b962d4878beddc569895af9" gracePeriod=30
Apr 16 19:55:55.777849 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:55.777807 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" podUID="a94af5f5-bc76-4166-81f8-5f322b1a86ab" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 19:55:55.935060 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:55.935029 2572 generic.go:358] "Generic (PLEG): container finished" podID="c37c23a9-747d-419a-8a58-907dc11ecf6d" containerID="165fa232cb75e58b4443aa764cbea418f0fc71367b962d4878beddc569895af9" exitCode=0
Apr 16 19:55:55.935197 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:55.935110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" event={"ID":"c37c23a9-747d-419a-8a58-907dc11ecf6d","Type":"ContainerDied","Data":"165fa232cb75e58b4443aa764cbea418f0fc71367b962d4878beddc569895af9"}
Apr 16 19:55:55.936366 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:55.936337 2572 generic.go:358] "Generic (PLEG): container finished" podID="2090cd61-18f9-425a-bf42-4b9628417aad" containerID="5530e97b315240bcac14d72022d5ddbd8912ce72f3266cc2034f5c9eba4da8d8" exitCode=0
Apr 16 19:55:55.936457 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:55.936383 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" event={"ID":"2090cd61-18f9-425a-bf42-4b9628417aad","Type":"ContainerDied","Data":"5530e97b315240bcac14d72022d5ddbd8912ce72f3266cc2034f5c9eba4da8d8"}
Apr 16 19:55:55.936748 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:55.936731 2572 scope.go:117] "RemoveContainer" containerID="5530e97b315240bcac14d72022d5ddbd8912ce72f3266cc2034f5c9eba4da8d8"
Apr 16 19:55:56.001468 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.001414 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-656d5cd769-kngz2"
Apr 16 19:55:56.080298 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080265 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-image-registry-private-configuration\") pod \"c37c23a9-747d-419a-8a58-907dc11ecf6d\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") "
Apr 16 19:55:56.080472 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080312 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftxm9\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-kube-api-access-ftxm9\") pod \"c37c23a9-747d-419a-8a58-907dc11ecf6d\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") "
Apr 16 19:55:56.080472 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080341 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-installation-pull-secrets\") pod \"c37c23a9-747d-419a-8a58-907dc11ecf6d\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") "
Apr 16 19:55:56.080472 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080374 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") pod \"c37c23a9-747d-419a-8a58-907dc11ecf6d\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") "
Apr 16 19:55:56.080472 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080418 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c37c23a9-747d-419a-8a58-907dc11ecf6d-ca-trust-extracted\") pod \"c37c23a9-747d-419a-8a58-907dc11ecf6d\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") "
Apr 16 19:55:56.080472 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080442 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-trusted-ca\") pod \"c37c23a9-747d-419a-8a58-907dc11ecf6d\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") "
Apr 16 19:55:56.080720 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080480 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-bound-sa-token\") pod \"c37c23a9-747d-419a-8a58-907dc11ecf6d\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") "
Apr 16 19:55:56.080720 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080525 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-certificates\") pod \"c37c23a9-747d-419a-8a58-907dc11ecf6d\" (UID: \"c37c23a9-747d-419a-8a58-907dc11ecf6d\") "
Apr 16 19:55:56.081057 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.080976 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c37c23a9-747d-419a-8a58-907dc11ecf6d" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:56.083075 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.083050 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-kube-api-access-ftxm9" (OuterVolumeSpecName: "kube-api-access-ftxm9") pod "c37c23a9-747d-419a-8a58-907dc11ecf6d" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d"). InnerVolumeSpecName "kube-api-access-ftxm9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:56.083075 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.083063 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c37c23a9-747d-419a-8a58-907dc11ecf6d" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:56.083241 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.083068 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c37c23a9-747d-419a-8a58-907dc11ecf6d" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:56.083241 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.083107 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c37c23a9-747d-419a-8a58-907dc11ecf6d" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:56.083316 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.083294 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c37c23a9-747d-419a-8a58-907dc11ecf6d" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:56.084445 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.084422 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c37c23a9-747d-419a-8a58-907dc11ecf6d" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:56.091927 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.091899 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c37c23a9-747d-419a-8a58-907dc11ecf6d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c37c23a9-747d-419a-8a58-907dc11ecf6d" (UID: "c37c23a9-747d-419a-8a58-907dc11ecf6d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:55:56.181372 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.181349 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c37c23a9-747d-419a-8a58-907dc11ecf6d-ca-trust-extracted\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 19:55:56.181372 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.181370 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-trusted-ca\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 19:55:56.181486 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.181380 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-bound-sa-token\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 19:55:56.181486 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.181389 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-certificates\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 19:55:56.181486 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.181398 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-image-registry-private-configuration\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 19:55:56.181486 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.181407 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftxm9\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-kube-api-access-ftxm9\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 19:55:56.181486 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.181416 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c37c23a9-747d-419a-8a58-907dc11ecf6d-installation-pull-secrets\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 19:55:56.181486 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.181424 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c37c23a9-747d-419a-8a58-907dc11ecf6d-registry-tls\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 19:55:56.939803 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.939773 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-656d5cd769-kngz2"
Apr 16 19:55:56.939803 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.939781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-656d5cd769-kngz2" event={"ID":"c37c23a9-747d-419a-8a58-907dc11ecf6d","Type":"ContainerDied","Data":"13095080d894c6b03269a7d9fad0936e3b996dec76517527a6fb156608f76a36"}
Apr 16 19:55:56.940312 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.939832 2572 scope.go:117] "RemoveContainer" containerID="165fa232cb75e58b4443aa764cbea418f0fc71367b962d4878beddc569895af9"
Apr 16 19:55:56.941631 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.941606 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qxcc8" event={"ID":"2090cd61-18f9-425a-bf42-4b9628417aad","Type":"ContainerStarted","Data":"972f087e812bda02209319dd9eb36e8b7c65abe87a0072d75b93ca1d9430b776"}
Apr 16 19:55:56.973932 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.973903 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-656d5cd769-kngz2"]
Apr 16 19:55:56.978962 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:56.978943 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-656d5cd769-kngz2"]
Apr 16 19:55:57.365143 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:57.365111 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c37c23a9-747d-419a-8a58-907dc11ecf6d" path="/var/lib/kubelet/pods/c37c23a9-747d-419a-8a58-907dc11ecf6d/volumes"
Apr 16 19:55:59.954495 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:59.954412 2572 generic.go:358] "Generic (PLEG): container finished" podID="499fb04b-1629-4d2a-8d4c-4b6f38ec093e" containerID="39b55aeab78a93c58cdce7f8f78fecc34e47191d6777d14076796b5b1cc8a17e" exitCode=0
Apr 16 19:55:59.954495 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:59.954453 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" event={"ID":"499fb04b-1629-4d2a-8d4c-4b6f38ec093e","Type":"ContainerDied","Data":"39b55aeab78a93c58cdce7f8f78fecc34e47191d6777d14076796b5b1cc8a17e"}
Apr 16 19:55:59.954865 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:55:59.954711 2572 scope.go:117] "RemoveContainer" containerID="39b55aeab78a93c58cdce7f8f78fecc34e47191d6777d14076796b5b1cc8a17e"
Apr 16 19:56:00.958894 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:00.958860 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-g96dz" event={"ID":"499fb04b-1629-4d2a-8d4c-4b6f38ec093e","Type":"ContainerStarted","Data":"ceadd5b7aa395d85642c48b373d376bfd04022ca3619b8394156b96e25445b3b"}
Apr 16 19:56:02.027109 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:02.027059 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:56:02.029493 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:02.029465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/458f83e2-e97a-457a-9081-a5ae099b6973-metrics-certs\") pod \"network-metrics-daemon-5j478\" (UID: \"458f83e2-e97a-457a-9081-a5ae099b6973\") " pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:56:02.282196 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:02.282122 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-l2m6x\""
Apr 16 19:56:02.289273 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:02.289256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5j478"
Apr 16 19:56:02.402515 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:02.402429 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5j478"]
Apr 16 19:56:02.404751 ip-10-0-128-48 kubenswrapper[2572]: W0416 19:56:02.404722 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod458f83e2_e97a_457a_9081_a5ae099b6973.slice/crio-c7eba854558e9c324974f0050e909e51ab91306932f86ad5dfb8987836575944 WatchSource:0}: Error finding container c7eba854558e9c324974f0050e909e51ab91306932f86ad5dfb8987836575944: Status 404 returned error can't find the container with id c7eba854558e9c324974f0050e909e51ab91306932f86ad5dfb8987836575944
Apr 16 19:56:02.964242 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:02.964204 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5j478" event={"ID":"458f83e2-e97a-457a-9081-a5ae099b6973","Type":"ContainerStarted","Data":"c7eba854558e9c324974f0050e909e51ab91306932f86ad5dfb8987836575944"}
Apr 16 19:56:03.968622 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:03.968559 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5j478" event={"ID":"458f83e2-e97a-457a-9081-a5ae099b6973","Type":"ContainerStarted","Data":"c3e7cf373237a17b8000eedf1d59125212f0afbfb8be8dafceb65edf89312819"}
Apr 16 19:56:03.968622 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:03.968598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5j478" event={"ID":"458f83e2-e97a-457a-9081-a5ae099b6973","Type":"ContainerStarted","Data":"f61f4d502e5d8850d5fd89c8bf48101eefeb3ed05a048fa74591eba38f9f5829"}
Apr 16 19:56:03.984285 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:03.984242 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5j478" podStartSLOduration=129.683669986 podStartE2EDuration="2m10.984229077s" podCreationTimestamp="2026-04-16 19:53:53 +0000 UTC" firstStartedPulling="2026-04-16 19:56:02.406619573 +0000 UTC m=+129.585381579" lastFinishedPulling="2026-04-16 19:56:03.707178659 +0000 UTC m=+130.885940670" observedRunningTime="2026-04-16 19:56:03.98415065 +0000 UTC m=+131.162912675" watchObservedRunningTime="2026-04-16 19:56:03.984229077 +0000 UTC m=+131.162991102"
Apr 16 19:56:05.776769 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:05.776724 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" podUID="a94af5f5-bc76-4166-81f8-5f322b1a86ab" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 19:56:15.777373 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:15.777286 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" podUID="a94af5f5-bc76-4166-81f8-5f322b1a86ab" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 19:56:15.777373 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:15.777363 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf"
Apr 16 19:56:15.777857 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:15.777799 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"fa90e674137d116c9ab9e146edb1115f9f9e54d3250ce8ddaa1a2db50ac49b63"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 19:56:15.777857 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:15.777845 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" podUID="a94af5f5-bc76-4166-81f8-5f322b1a86ab" containerName="service-proxy" containerID="cri-o://fa90e674137d116c9ab9e146edb1115f9f9e54d3250ce8ddaa1a2db50ac49b63" gracePeriod=30
Apr 16 19:56:16.002148 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:16.002120 2572 generic.go:358] "Generic (PLEG): container finished" podID="a94af5f5-bc76-4166-81f8-5f322b1a86ab" containerID="fa90e674137d116c9ab9e146edb1115f9f9e54d3250ce8ddaa1a2db50ac49b63" exitCode=2
Apr 16 19:56:16.002257 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:16.002186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" event={"ID":"a94af5f5-bc76-4166-81f8-5f322b1a86ab","Type":"ContainerDied","Data":"fa90e674137d116c9ab9e146edb1115f9f9e54d3250ce8ddaa1a2db50ac49b63"}
Apr 16 19:56:16.002257 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:16.002222 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-78477dcc55-b42tf" event={"ID":"a94af5f5-bc76-4166-81f8-5f322b1a86ab","Type":"ContainerStarted","Data":"d3c23f4595a877118bde53e85511435d77b8a1e41e818e06b2f1fb0f93d1f41e"}
Apr 16 19:56:16.003491 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:16.003471 2572 generic.go:358] "Generic (PLEG): container finished" podID="c485156a-3973-41b9-9936-2cd58d6d6ea4" containerID="5e807dbf4e4f915719997d8ae94049b27f84f96d60e855469438293fd679d64b" exitCode=0
Apr 16 19:56:16.003592 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:16.003545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gm4s4" event={"ID":"c485156a-3973-41b9-9936-2cd58d6d6ea4","Type":"ContainerDied","Data":"5e807dbf4e4f915719997d8ae94049b27f84f96d60e855469438293fd679d64b"}
Apr 16 19:56:16.003867 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:16.003851 2572 scope.go:117] "RemoveContainer" containerID="5e807dbf4e4f915719997d8ae94049b27f84f96d60e855469438293fd679d64b"
Apr 16 19:56:16.997215 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:16.997186 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-74fc646496-l58qj_142d9ab9-4b05-479d-b198-2760a09292d1/router/0.log"
Apr 16 19:56:17.008449 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:17.008417 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gm4s4" event={"ID":"c485156a-3973-41b9-9936-2cd58d6d6ea4","Type":"ContainerStarted","Data":"cecf56ea46b1d8d8e27b8a3defc9d0d020886a1cd4d10d1e4b47a84ea4199ab8"}
Apr 16 19:56:17.009132 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:56:17.009114 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ppsrs_2676cacd-954e-4b7c-a85c-1b43b90f0471/serve-healthcheck-canary/0.log"
Apr 16 19:58:53.223980 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:58:53.223951 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log"
Apr 16 19:58:53.224463 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:58:53.224085 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log"
Apr 16 19:58:53.231534 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:58:53.231513 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log"
Apr 16 19:58:53.231686 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:58:53.231534 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log"
Apr 16 19:58:53.234578 ip-10-0-128-48 kubenswrapper[2572]: I0416 19:58:53.234560 2572 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 20:01:19.420400 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.420314 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr"]
Apr 16 20:01:19.420975 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.420787 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c37c23a9-747d-419a-8a58-907dc11ecf6d" containerName="registry"
Apr 16 20:01:19.420975 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.420806 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37c23a9-747d-419a-8a58-907dc11ecf6d" containerName="registry"
Apr 16 20:01:19.420975 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.420892 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c37c23a9-747d-419a-8a58-907dc11ecf6d" containerName="registry"
Apr 16 20:01:19.423088 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.423061 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr"
Apr 16 20:01:19.425850 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.425821 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 20:01:19.426669 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.426647 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-74msk\""
Apr 16 20:01:19.437409 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.437387 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr"]
Apr 16 20:01:19.557846 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.557815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr"
Apr 16 20:01:19.558012 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.557865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr"
Apr 16 20:01:19.558012 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.557919 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName:
\"kubernetes.io/projected/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.558012 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.557978 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.558144 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.558015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.558144 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.558045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zthf\" (UniqueName: \"kubernetes.io/projected/33697ceb-0a50-49f2-830f-8bf9e962c02c-kube-api-access-9zthf\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.558144 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.558079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.558144 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.558121 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.558266 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.558190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/33697ceb-0a50-49f2-830f-8bf9e962c02c-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659346 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/33697ceb-0a50-49f2-830f-8bf9e962c02c-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659512 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659512 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659512 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659512 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659713 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-workload-socket\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659713 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zthf\" (UniqueName: \"kubernetes.io/projected/33697ceb-0a50-49f2-830f-8bf9e962c02c-kube-api-access-9zthf\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659713 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659713 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659604 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659912 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") 
" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.659912 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.659867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.660038 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.660016 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.660132 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.660039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/33697ceb-0a50-49f2-830f-8bf9e962c02c-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.660132 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.660048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.661959 ip-10-0-128-48 kubenswrapper[2572]: 
I0416 20:01:19.661931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.662119 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.662084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.667923 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.667897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/33697ceb-0a50-49f2-830f-8bf9e962c02c-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.668135 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.668119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zthf\" (UniqueName: \"kubernetes.io/projected/33697ceb-0a50-49f2-830f-8bf9e962c02c-kube-api-access-9zthf\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-5pvbr\" (UID: \"33697ceb-0a50-49f2-830f-8bf9e962c02c\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.734235 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.734167 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:19.859562 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.859398 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr"] Apr 16 20:01:19.862161 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:01:19.862127 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33697ceb_0a50_49f2_830f_8bf9e962c02c.slice/crio-f7ff2c4a33a47246441e1b836fa43448e0f70950a15c76ae8f353d8d55ab5fb0 WatchSource:0}: Error finding container f7ff2c4a33a47246441e1b836fa43448e0f70950a15c76ae8f353d8d55ab5fb0: Status 404 returned error can't find the container with id f7ff2c4a33a47246441e1b836fa43448e0f70950a15c76ae8f353d8d55ab5fb0 Apr 16 20:01:19.863842 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:19.863826 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:01:20.838857 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:20.838814 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" event={"ID":"33697ceb-0a50-49f2-830f-8bf9e962c02c","Type":"ContainerStarted","Data":"f7ff2c4a33a47246441e1b836fa43448e0f70950a15c76ae8f353d8d55ab5fb0"} Apr 16 20:01:22.381649 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:22.381609 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:01:22.381946 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:22.381684 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:01:22.381946 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:01:22.381712 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 20:01:22.846078 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:22.846045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" event={"ID":"33697ceb-0a50-49f2-830f-8bf9e962c02c","Type":"ContainerStarted","Data":"8a60d6fc04796cadd14bf672d4302597dfb7a0565ae8eeee11f2b997ed5cdc37"} Apr 16 20:01:22.866954 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:22.866903 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" podStartSLOduration=1.349462183 podStartE2EDuration="3.866886218s" podCreationTimestamp="2026-04-16 20:01:19 +0000 UTC" firstStartedPulling="2026-04-16 20:01:19.863961314 +0000 UTC m=+447.042723316" lastFinishedPulling="2026-04-16 20:01:22.381385346 +0000 UTC m=+449.560147351" observedRunningTime="2026-04-16 20:01:22.866027145 +0000 UTC m=+450.044789169" watchObservedRunningTime="2026-04-16 20:01:22.866886218 +0000 UTC m=+450.045648244" Apr 16 20:01:23.734768 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:23.734728 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:23.739525 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:23.739492 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:23.849236 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:23.849202 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 
20:01:23.850186 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:23.850165 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-5pvbr" Apr 16 20:01:39.233053 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.233020 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr"] Apr 16 20:01:39.235267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.235239 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:39.240492 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.240469 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 20:01:39.240595 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.240470 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-pj9lg\"" Apr 16 20:01:39.240770 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.240755 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 20:01:39.256954 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.256933 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr"] Apr 16 20:01:39.329393 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.329360 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tbkt\" (UniqueName: \"kubernetes.io/projected/9eaa2062-a5e2-480b-bcf7-9a163da37f94-kube-api-access-7tbkt\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-ltgfr\" (UID: \"9eaa2062-a5e2-480b-bcf7-9a163da37f94\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:39.329539 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.329411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9eaa2062-a5e2-480b-bcf7-9a163da37f94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-ltgfr\" (UID: \"9eaa2062-a5e2-480b-bcf7-9a163da37f94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:39.332655 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.332630 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz"] Apr 16 20:01:39.334771 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.334755 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.340103 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.340066 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:01:39.340229 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.340143 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 20:01:39.340229 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.340188 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 20:01:39.340365 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.340258 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 20:01:39.343107 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.343077 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bs2lj\"" Apr 16 20:01:39.353934 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.353917 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 20:01:39.374323 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.374299 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz"] Apr 16 20:01:39.430706 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.430673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pg9d\" (UniqueName: \"kubernetes.io/projected/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-kube-api-access-5pg9d\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.430844 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.430730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9eaa2062-a5e2-480b-bcf7-9a163da37f94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-ltgfr\" (UID: \"9eaa2062-a5e2-480b-bcf7-9a163da37f94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:39.430844 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.430761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-cert\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.430844 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.430821 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-manager-config\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.431005 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.430853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-metrics-cert\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.431005 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.430886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tbkt\" (UniqueName: \"kubernetes.io/projected/9eaa2062-a5e2-480b-bcf7-9a163da37f94-kube-api-access-7tbkt\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-ltgfr\" (UID: \"9eaa2062-a5e2-480b-bcf7-9a163da37f94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:39.431118 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.431021 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9eaa2062-a5e2-480b-bcf7-9a163da37f94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-ltgfr\" (UID: \"9eaa2062-a5e2-480b-bcf7-9a163da37f94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:39.447347 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.447324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tbkt\" (UniqueName: 
\"kubernetes.io/projected/9eaa2062-a5e2-480b-bcf7-9a163da37f94-kube-api-access-7tbkt\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-ltgfr\" (UID: \"9eaa2062-a5e2-480b-bcf7-9a163da37f94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:39.532267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.532196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-manager-config\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.532267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.532236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-metrics-cert\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.532441 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.532284 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pg9d\" (UniqueName: \"kubernetes.io/projected/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-kube-api-access-5pg9d\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.532441 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.532333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-cert\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " 
pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.532916 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.532891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-manager-config\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.534685 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.534663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-metrics-cert\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.534841 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.534820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-cert\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.543846 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.543827 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pg9d\" (UniqueName: \"kubernetes.io/projected/58fbb2f0-9cdf-4a93-8d11-f46cb14742e4-kube-api-access-5pg9d\") pod \"lws-controller-manager-ddc57ffc5-tnlfz\" (UID: \"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4\") " pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.546066 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.546049 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:39.643338 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.643313 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:39.685134 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.685105 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr"] Apr 16 20:01:39.687167 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:01:39.687132 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9eaa2062_a5e2_480b_bcf7_9a163da37f94.slice/crio-422005b091551550a5a3347d95118d2a181ba55e2e382565c0698ba9b8e61d49 WatchSource:0}: Error finding container 422005b091551550a5a3347d95118d2a181ba55e2e382565c0698ba9b8e61d49: Status 404 returned error can't find the container with id 422005b091551550a5a3347d95118d2a181ba55e2e382565c0698ba9b8e61d49 Apr 16 20:01:39.770676 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.770650 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz"] Apr 16 20:01:39.772969 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:01:39.772941 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58fbb2f0_9cdf_4a93_8d11_f46cb14742e4.slice/crio-47f243fc162e757685b8b2241595bbd43adbfe3f7a0ac886a0d9e65e5b8850b9 WatchSource:0}: Error finding container 47f243fc162e757685b8b2241595bbd43adbfe3f7a0ac886a0d9e65e5b8850b9: Status 404 returned error can't find the container with id 47f243fc162e757685b8b2241595bbd43adbfe3f7a0ac886a0d9e65e5b8850b9 Apr 16 20:01:39.893652 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.893615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" event={"ID":"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4","Type":"ContainerStarted","Data":"47f243fc162e757685b8b2241595bbd43adbfe3f7a0ac886a0d9e65e5b8850b9"} Apr 16 20:01:39.894763 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:39.894739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" event={"ID":"9eaa2062-a5e2-480b-bcf7-9a163da37f94","Type":"ContainerStarted","Data":"422005b091551550a5a3347d95118d2a181ba55e2e382565c0698ba9b8e61d49"} Apr 16 20:01:44.916781 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:44.916748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" event={"ID":"58fbb2f0-9cdf-4a93-8d11-f46cb14742e4","Type":"ContainerStarted","Data":"d6145f023a70862334223d90611761f75f618227c84fe019e805544af1537217"} Apr 16 20:01:44.917222 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:44.916842 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:01:44.918381 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:44.918344 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" event={"ID":"9eaa2062-a5e2-480b-bcf7-9a163da37f94","Type":"ContainerStarted","Data":"474ed135fba79ba984bc8e28f5a9b1e4ad351d94981103a697312320f5b234b2"} Apr 16 20:01:44.918502 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:44.918485 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:44.936765 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:44.936710 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" 
podStartSLOduration=1.7347329569999999 podStartE2EDuration="5.936695403s" podCreationTimestamp="2026-04-16 20:01:39 +0000 UTC" firstStartedPulling="2026-04-16 20:01:39.774697959 +0000 UTC m=+466.953459962" lastFinishedPulling="2026-04-16 20:01:43.976660405 +0000 UTC m=+471.155422408" observedRunningTime="2026-04-16 20:01:44.9361872 +0000 UTC m=+472.114949226" watchObservedRunningTime="2026-04-16 20:01:44.936695403 +0000 UTC m=+472.115457429" Apr 16 20:01:44.956696 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:44.956656 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" podStartSLOduration=1.661188318 podStartE2EDuration="5.956645628s" podCreationTimestamp="2026-04-16 20:01:39 +0000 UTC" firstStartedPulling="2026-04-16 20:01:39.696893142 +0000 UTC m=+466.875655152" lastFinishedPulling="2026-04-16 20:01:43.992350455 +0000 UTC m=+471.171112462" observedRunningTime="2026-04-16 20:01:44.956484607 +0000 UTC m=+472.135246634" watchObservedRunningTime="2026-04-16 20:01:44.956645628 +0000 UTC m=+472.135407653" Apr 16 20:01:55.923776 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:55.923744 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-ltgfr" Apr 16 20:01:55.924302 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:01:55.923866 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-ddc57ffc5-tnlfz" Apr 16 20:03:53.246442 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:03:53.246414 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:03:53.246904 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:03:53.246868 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:03:53.253135 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:03:53.253112 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 20:03:53.253388 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:03:53.253368 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 20:05:35.793994 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.793960 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg"] Apr 16 20:05:35.797530 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.797507 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:35.800515 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.800495 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:05:35.801128 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.801110 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pqxtj\"" Apr 16 20:05:35.802062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.802037 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:05:35.802602 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.802583 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 16 
20:05:35.817122 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.817079 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg"] Apr 16 20:05:35.909466 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.909435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbf77875-163d-406b-9999-d306320dea79-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:35.909596 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.909477 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:35.909596 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.909533 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:35.909596 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.909550 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgddk\" (UniqueName: \"kubernetes.io/projected/dbf77875-163d-406b-9999-d306320dea79-kube-api-access-vgddk\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:35.909596 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.909569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:35.909596 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:35.909597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.010403 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.010533 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.010577 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgddk\" (UniqueName: \"kubernetes.io/projected/dbf77875-163d-406b-9999-d306320dea79-kube-api-access-vgddk\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.010612 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.010662 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.010820 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbf77875-163d-406b-9999-d306320dea79-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.010911 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010814 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.010968 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.011020 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.010967 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.012706 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.012682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: 
\"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.013256 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.013233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbf77875-163d-406b-9999-d306320dea79-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.019387 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.019363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgddk\" (UniqueName: \"kubernetes.io/projected/dbf77875-163d-406b-9999-d306320dea79-kube-api-access-vgddk\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.110059 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.109975 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:36.234397 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.234301 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg"] Apr 16 20:05:36.238130 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:05:36.238080 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbf77875_163d_406b_9999_d306320dea79.slice/crio-72e708f6b2dfd6852c5505c3e7bd7b1f57a4523fd5d53a8679a00a2d0ebeb7ce WatchSource:0}: Error finding container 72e708f6b2dfd6852c5505c3e7bd7b1f57a4523fd5d53a8679a00a2d0ebeb7ce: Status 404 returned error can't find the container with id 72e708f6b2dfd6852c5505c3e7bd7b1f57a4523fd5d53a8679a00a2d0ebeb7ce Apr 16 20:05:36.638716 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:36.638679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" event={"ID":"dbf77875-163d-406b-9999-d306320dea79","Type":"ContainerStarted","Data":"72e708f6b2dfd6852c5505c3e7bd7b1f57a4523fd5d53a8679a00a2d0ebeb7ce"} Apr 16 20:05:39.650884 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:39.650846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" event={"ID":"dbf77875-163d-406b-9999-d306320dea79","Type":"ContainerStarted","Data":"0c58b44cb9398ce49146b12d8a3bfac0fc8b86100e8928014e3face7a0cd8482"} Apr 16 20:05:43.666267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:43.666181 2572 generic.go:358] "Generic (PLEG): container finished" podID="dbf77875-163d-406b-9999-d306320dea79" containerID="0c58b44cb9398ce49146b12d8a3bfac0fc8b86100e8928014e3face7a0cd8482" exitCode=0 Apr 16 20:05:43.666267 ip-10-0-128-48 kubenswrapper[2572]: I0416 
20:05:43.666250 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" event={"ID":"dbf77875-163d-406b-9999-d306320dea79","Type":"ContainerDied","Data":"0c58b44cb9398ce49146b12d8a3bfac0fc8b86100e8928014e3face7a0cd8482"} Apr 16 20:05:45.674360 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:45.674318 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" event={"ID":"dbf77875-163d-406b-9999-d306320dea79","Type":"ContainerStarted","Data":"7dd776c585c5d0f17f7a31fc7ed6e4fc3ce8b00feedc6a5f578dd3d2594b617c"} Apr 16 20:05:45.700961 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:45.700903 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" podStartSLOduration=2.211285743 podStartE2EDuration="10.70088657s" podCreationTimestamp="2026-04-16 20:05:35 +0000 UTC" firstStartedPulling="2026-04-16 20:05:36.240066488 +0000 UTC m=+703.418828493" lastFinishedPulling="2026-04-16 20:05:44.72966731 +0000 UTC m=+711.908429320" observedRunningTime="2026-04-16 20:05:45.699027675 +0000 UTC m=+712.877789702" watchObservedRunningTime="2026-04-16 20:05:45.70088657 +0000 UTC m=+712.879648594" Apr 16 20:05:46.110630 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:46.110592 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:46.110802 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:46.110641 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:46.123589 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:46.123564 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:05:46.688709 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:05:46.688681 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:07:12.786728 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:12.786653 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg"] Apr 16 20:07:12.787224 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:12.787016 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" podUID="dbf77875-163d-406b-9999-d306320dea79" containerName="main" containerID="cri-o://7dd776c585c5d0f17f7a31fc7ed6e4fc3ce8b00feedc6a5f578dd3d2594b617c" gracePeriod=30 Apr 16 20:07:12.957948 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:12.957917 2572 generic.go:358] "Generic (PLEG): container finished" podID="dbf77875-163d-406b-9999-d306320dea79" containerID="7dd776c585c5d0f17f7a31fc7ed6e4fc3ce8b00feedc6a5f578dd3d2594b617c" exitCode=0 Apr 16 20:07:12.958107 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:12.957980 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" event={"ID":"dbf77875-163d-406b-9999-d306320dea79","Type":"ContainerDied","Data":"7dd776c585c5d0f17f7a31fc7ed6e4fc3ce8b00feedc6a5f578dd3d2594b617c"} Apr 16 20:07:13.029104 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.029069 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:07:13.191267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191230 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-kserve-provision-location\") pod \"dbf77875-163d-406b-9999-d306320dea79\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " Apr 16 20:07:13.191426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191294 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgddk\" (UniqueName: \"kubernetes.io/projected/dbf77875-163d-406b-9999-d306320dea79-kube-api-access-vgddk\") pod \"dbf77875-163d-406b-9999-d306320dea79\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " Apr 16 20:07:13.191426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191310 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-model-cache\") pod \"dbf77875-163d-406b-9999-d306320dea79\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " Apr 16 20:07:13.191426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191361 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-home\") pod \"dbf77875-163d-406b-9999-d306320dea79\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " Apr 16 20:07:13.191426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191384 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-dshm\") pod \"dbf77875-163d-406b-9999-d306320dea79\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " Apr 16 20:07:13.191426 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191413 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbf77875-163d-406b-9999-d306320dea79-tls-certs\") pod \"dbf77875-163d-406b-9999-d306320dea79\" (UID: \"dbf77875-163d-406b-9999-d306320dea79\") " Apr 16 20:07:13.191676 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191540 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-model-cache" (OuterVolumeSpecName: "model-cache") pod "dbf77875-163d-406b-9999-d306320dea79" (UID: "dbf77875-163d-406b-9999-d306320dea79"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:07:13.191676 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191651 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-home" (OuterVolumeSpecName: "home") pod "dbf77875-163d-406b-9999-d306320dea79" (UID: "dbf77875-163d-406b-9999-d306320dea79"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:07:13.191676 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.191664 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:07:13.193615 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.193583 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf77875-163d-406b-9999-d306320dea79-kube-api-access-vgddk" (OuterVolumeSpecName: "kube-api-access-vgddk") pod "dbf77875-163d-406b-9999-d306320dea79" (UID: "dbf77875-163d-406b-9999-d306320dea79"). InnerVolumeSpecName "kube-api-access-vgddk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:07:13.193732 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.193616 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-dshm" (OuterVolumeSpecName: "dshm") pod "dbf77875-163d-406b-9999-d306320dea79" (UID: "dbf77875-163d-406b-9999-d306320dea79"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:07:13.193732 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.193661 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf77875-163d-406b-9999-d306320dea79-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dbf77875-163d-406b-9999-d306320dea79" (UID: "dbf77875-163d-406b-9999-d306320dea79"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:07:13.245854 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.245822 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dbf77875-163d-406b-9999-d306320dea79" (UID: "dbf77875-163d-406b-9999-d306320dea79"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:07:13.292759 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.292733 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vgddk\" (UniqueName: \"kubernetes.io/projected/dbf77875-163d-406b-9999-d306320dea79-kube-api-access-vgddk\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:07:13.292759 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.292755 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:07:13.292759 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.292763 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:07:13.292936 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.292773 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dbf77875-163d-406b-9999-d306320dea79-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:07:13.292936 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.292782 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dbf77875-163d-406b-9999-d306320dea79-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:07:13.962157 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.962118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" event={"ID":"dbf77875-163d-406b-9999-d306320dea79","Type":"ContainerDied","Data":"72e708f6b2dfd6852c5505c3e7bd7b1f57a4523fd5d53a8679a00a2d0ebeb7ce"} Apr 16 20:07:13.962558 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.962168 2572 scope.go:117] "RemoveContainer" containerID="7dd776c585c5d0f17f7a31fc7ed6e4fc3ce8b00feedc6a5f578dd3d2594b617c" Apr 16 20:07:13.962558 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.962170 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg" Apr 16 20:07:13.970066 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.970047 2572 scope.go:117] "RemoveContainer" containerID="0c58b44cb9398ce49146b12d8a3bfac0fc8b86100e8928014e3face7a0cd8482" Apr 16 20:07:13.981550 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.981529 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg"] Apr 16 20:07:13.985652 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:13.985627 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-7b44d99f79b2nzg"] Apr 16 20:07:15.365111 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.365073 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf77875-163d-406b-9999-d306320dea79" path="/var/lib/kubelet/pods/dbf77875-163d-406b-9999-d306320dea79/volumes" Apr 16 20:07:15.612878 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.612842 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq"] Apr 16 20:07:15.613219 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.613202 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbf77875-163d-406b-9999-d306320dea79" containerName="storage-initializer" Apr 16 20:07:15.613351 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.613221 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf77875-163d-406b-9999-d306320dea79" 
containerName="storage-initializer" Apr 16 20:07:15.613351 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.613242 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbf77875-163d-406b-9999-d306320dea79" containerName="main" Apr 16 20:07:15.613351 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.613250 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf77875-163d-406b-9999-d306320dea79" containerName="main" Apr 16 20:07:15.613351 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.613327 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbf77875-163d-406b-9999-d306320dea79" containerName="main" Apr 16 20:07:15.615629 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.615575 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.620450 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.620429 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 20:07:15.620571 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.620450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 20:07:15.620571 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.620458 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 20:07:15.620571 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.620459 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pqxtj\"" Apr 16 20:07:15.631224 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.631204 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq"] Apr 16 20:07:15.810486 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.810448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvx6b\" (UniqueName: \"kubernetes.io/projected/ea241c85-3097-48f6-87de-c739959c64db-kube-api-access-mvx6b\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.810647 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.810497 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.810647 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.810560 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.810647 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.810599 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: 
\"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.810752 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.810651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea241c85-3097-48f6-87de-c739959c64db-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.810752 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.810711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.911267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.911199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.911267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.911231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: 
\"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.911267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.911256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.911502 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.911289 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea241c85-3097-48f6-87de-c739959c64db-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.911502 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.911335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.911502 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.911373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvx6b\" (UniqueName: \"kubernetes.io/projected/ea241c85-3097-48f6-87de-c739959c64db-kube-api-access-mvx6b\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.911720 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.911694 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.911836 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.911810 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.912202 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.912153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.914039 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.914017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.914430 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.914410 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea241c85-3097-48f6-87de-c739959c64db-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.924323 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.924296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvx6b\" (UniqueName: \"kubernetes.io/projected/ea241c85-3097-48f6-87de-c739959c64db-kube-api-access-mvx6b\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:15.925877 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:15.925856 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:16.054163 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:16.054062 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq"] Apr 16 20:07:16.056959 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:07:16.056926 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea241c85_3097_48f6_87de_c739959c64db.slice/crio-128b4881822bf3e54c0966f7c793bf5dbf5177d3306d94b4ebb01ddfc9b7b4f4 WatchSource:0}: Error finding container 128b4881822bf3e54c0966f7c793bf5dbf5177d3306d94b4ebb01ddfc9b7b4f4: Status 404 returned error can't find the container with id 128b4881822bf3e54c0966f7c793bf5dbf5177d3306d94b4ebb01ddfc9b7b4f4 Apr 16 20:07:16.058822 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:16.058802 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:07:16.974245 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:16.974198 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" event={"ID":"ea241c85-3097-48f6-87de-c739959c64db","Type":"ContainerStarted","Data":"24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734"} Apr 16 20:07:16.974245 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:16.974247 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" event={"ID":"ea241c85-3097-48f6-87de-c739959c64db","Type":"ContainerStarted","Data":"128b4881822bf3e54c0966f7c793bf5dbf5177d3306d94b4ebb01ddfc9b7b4f4"} Apr 16 20:07:20.987322 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:20.987290 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="ea241c85-3097-48f6-87de-c739959c64db" containerID="24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734" exitCode=0 Apr 16 20:07:20.987322 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:20.987327 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" event={"ID":"ea241c85-3097-48f6-87de-c739959c64db","Type":"ContainerDied","Data":"24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734"} Apr 16 20:07:48.091778 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:48.091741 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" event={"ID":"ea241c85-3097-48f6-87de-c739959c64db","Type":"ContainerStarted","Data":"ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366"} Apr 16 20:07:48.119174 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:48.119116 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podStartSLOduration=6.532782558 podStartE2EDuration="33.119078785s" podCreationTimestamp="2026-04-16 20:07:15 +0000 UTC" firstStartedPulling="2026-04-16 20:07:20.988304361 +0000 UTC m=+808.167066364" lastFinishedPulling="2026-04-16 20:07:47.574600584 +0000 UTC m=+834.753362591" observedRunningTime="2026-04-16 20:07:48.118586739 +0000 UTC m=+835.297348770" watchObservedRunningTime="2026-04-16 20:07:48.119078785 +0000 UTC m=+835.297840813" Apr 16 20:07:55.926178 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:55.926145 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:55.926728 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:55.926194 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:07:55.927409 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:07:55.927383 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 20:08:05.926595 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:05.926552 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 20:08:15.927126 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:15.927057 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 20:08:25.926542 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:25.926501 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 20:08:35.926184 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:35.926140 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" 
podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 20:08:45.926856 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:45.926764 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 20:08:53.271033 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:53.271002 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:08:53.273573 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:53.273540 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:08:53.278768 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:53.278734 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 20:08:53.281315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:53.281295 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 20:08:55.927152 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:08:55.927110 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 20:09:05.926669 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:09:05.926620 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 20:09:15.935936 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:09:15.935899 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:09:15.943710 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:09:15.943685 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:10:21.630629 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:21.630540 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq"] Apr 16 20:10:21.633382 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:21.630918 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" containerID="cri-o://ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366" gracePeriod=30 Apr 16 20:10:25.215890 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.215850 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws"] Apr 16 20:10:25.224211 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.224174 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.227414 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.227388 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 20:10:25.228937 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.228910 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws"] Apr 16 20:10:25.270987 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.270959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7685da0-508e-483c-8474-145e3f203c64-tls-certs\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.271183 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.271005 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-home\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.271183 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.271067 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-dshm\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.271183 ip-10-0-128-48 kubenswrapper[2572]: I0416 
20:10:25.271138 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.271183 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.271177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-model-cache\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.271342 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.271212 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th6m6\" (UniqueName: \"kubernetes.io/projected/d7685da0-508e-483c-8474-145e3f203c64-kube-api-access-th6m6\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.371769 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.371739 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7685da0-508e-483c-8474-145e3f203c64-tls-certs\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.371925 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.371781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-home\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.371925 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.371897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-dshm\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.371994 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.371943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.372062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.372043 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-model-cache\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.372129 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.372116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th6m6\" (UniqueName: \"kubernetes.io/projected/d7685da0-508e-483c-8474-145e3f203c64-kube-api-access-th6m6\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" 
(UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.372191 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.372161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-home\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.372329 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.372308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.372457 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.372414 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-model-cache\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.374115 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.374079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-dshm\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.374301 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.374284 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7685da0-508e-483c-8474-145e3f203c64-tls-certs\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.380347 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.380327 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th6m6\" (UniqueName: \"kubernetes.io/projected/d7685da0-508e-483c-8474-145e3f203c64-kube-api-access-th6m6\") pod \"custom-route-timeout-test-kserve-5b8cffcb46-jlzws\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.536902 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.536824 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:25.660566 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:25.660542 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws"] Apr 16 20:10:25.662586 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:10:25.662559 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7685da0_508e_483c_8474_145e3f203c64.slice/crio-2175cb1fa17b3db2927541441e15a4421222c5b7421698b98df686325dc87033 WatchSource:0}: Error finding container 2175cb1fa17b3db2927541441e15a4421222c5b7421698b98df686325dc87033: Status 404 returned error can't find the container with id 2175cb1fa17b3db2927541441e15a4421222c5b7421698b98df686325dc87033 Apr 16 20:10:26.610749 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:26.610714 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" event={"ID":"d7685da0-508e-483c-8474-145e3f203c64","Type":"ContainerStarted","Data":"c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f"} Apr 16 20:10:26.611142 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:26.610757 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" event={"ID":"d7685da0-508e-483c-8474-145e3f203c64","Type":"ContainerStarted","Data":"2175cb1fa17b3db2927541441e15a4421222c5b7421698b98df686325dc87033"} Apr 16 20:10:30.625048 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:30.625013 2572 generic.go:358] "Generic (PLEG): container finished" podID="d7685da0-508e-483c-8474-145e3f203c64" containerID="c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f" exitCode=0 Apr 16 20:10:30.625519 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:30.625109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" event={"ID":"d7685da0-508e-483c-8474-145e3f203c64","Type":"ContainerDied","Data":"c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f"} Apr 16 20:10:31.629365 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:31.629322 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" event={"ID":"d7685da0-508e-483c-8474-145e3f203c64","Type":"ContainerStarted","Data":"708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863"} Apr 16 20:10:31.649520 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:31.649461 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podStartSLOduration=6.649444139 podStartE2EDuration="6.649444139s" podCreationTimestamp="2026-04-16 20:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:10:31.647992616 +0000 UTC m=+998.826754642" watchObservedRunningTime="2026-04-16 20:10:31.649444139 +0000 UTC m=+998.828206164" Apr 16 20:10:35.537807 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:35.537755 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:35.537807 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:35.537815 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:10:35.539531 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:35.539505 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 20:10:45.538144 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:45.538065 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 20:10:51.877378 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:51.877353 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq_ea241c85-3097-48f6-87de-c739959c64db/main/0.log" Apr 16 20:10:51.877751 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:51.877736 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:10:52.002958 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.002876 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea241c85-3097-48f6-87de-c739959c64db-tls-certs\") pod \"ea241c85-3097-48f6-87de-c739959c64db\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " Apr 16 20:10:52.002958 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.002927 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvx6b\" (UniqueName: \"kubernetes.io/projected/ea241c85-3097-48f6-87de-c739959c64db-kube-api-access-mvx6b\") pod \"ea241c85-3097-48f6-87de-c739959c64db\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " Apr 16 20:10:52.002958 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.002948 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-kserve-provision-location\") pod \"ea241c85-3097-48f6-87de-c739959c64db\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " Apr 16 20:10:52.003263 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.002975 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-dshm\") pod \"ea241c85-3097-48f6-87de-c739959c64db\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " Apr 16 20:10:52.003263 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.003021 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-model-cache\") pod \"ea241c85-3097-48f6-87de-c739959c64db\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " Apr 16 20:10:52.003263 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.003059 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-home\") pod \"ea241c85-3097-48f6-87de-c739959c64db\" (UID: \"ea241c85-3097-48f6-87de-c739959c64db\") " Apr 16 20:10:52.003419 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.003350 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-model-cache" (OuterVolumeSpecName: "model-cache") pod "ea241c85-3097-48f6-87de-c739959c64db" (UID: "ea241c85-3097-48f6-87de-c739959c64db"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:52.003622 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.003598 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-home" (OuterVolumeSpecName: "home") pod "ea241c85-3097-48f6-87de-c739959c64db" (UID: "ea241c85-3097-48f6-87de-c739959c64db"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:52.005404 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.005374 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea241c85-3097-48f6-87de-c739959c64db-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ea241c85-3097-48f6-87de-c739959c64db" (UID: "ea241c85-3097-48f6-87de-c739959c64db"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:10:52.005577 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.005561 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-dshm" (OuterVolumeSpecName: "dshm") pod "ea241c85-3097-48f6-87de-c739959c64db" (UID: "ea241c85-3097-48f6-87de-c739959c64db"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:52.005715 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.005702 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea241c85-3097-48f6-87de-c739959c64db-kube-api-access-mvx6b" (OuterVolumeSpecName: "kube-api-access-mvx6b") pod "ea241c85-3097-48f6-87de-c739959c64db" (UID: "ea241c85-3097-48f6-87de-c739959c64db"). InnerVolumeSpecName "kube-api-access-mvx6b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:10:52.071985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.071923 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ea241c85-3097-48f6-87de-c739959c64db" (UID: "ea241c85-3097-48f6-87de-c739959c64db"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:10:52.104224 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.104194 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:10:52.104224 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.104221 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:10:52.104224 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.104230 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea241c85-3097-48f6-87de-c739959c64db-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:10:52.104437 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.104238 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvx6b\" (UniqueName: \"kubernetes.io/projected/ea241c85-3097-48f6-87de-c739959c64db-kube-api-access-mvx6b\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:10:52.104437 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.104250 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:10:52.104437 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.104258 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea241c85-3097-48f6-87de-c739959c64db-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:10:52.699085 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.699055 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq_ea241c85-3097-48f6-87de-c739959c64db/main/0.log" Apr 16 20:10:52.699439 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.699412 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea241c85-3097-48f6-87de-c739959c64db" containerID="ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366" exitCode=137 Apr 16 20:10:52.699540 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.699491 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" event={"ID":"ea241c85-3097-48f6-87de-c739959c64db","Type":"ContainerDied","Data":"ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366"} Apr 16 20:10:52.699540 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.699514 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" Apr 16 20:10:52.699540 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.699533 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq" event={"ID":"ea241c85-3097-48f6-87de-c739959c64db","Type":"ContainerDied","Data":"128b4881822bf3e54c0966f7c793bf5dbf5177d3306d94b4ebb01ddfc9b7b4f4"} Apr 16 20:10:52.699641 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.699549 2572 scope.go:117] "RemoveContainer" containerID="ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366" Apr 16 20:10:52.722462 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.722433 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq"] Apr 16 20:10:52.724915 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.724890 2572 scope.go:117] "RemoveContainer" 
containerID="24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734" Apr 16 20:10:52.726282 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.726257 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-598cb956d-n47tq"] Apr 16 20:10:52.734624 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.734605 2572 scope.go:117] "RemoveContainer" containerID="ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366" Apr 16 20:10:52.734940 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:10:52.734919 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366\": container with ID starting with ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366 not found: ID does not exist" containerID="ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366" Apr 16 20:10:52.735000 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.734949 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366"} err="failed to get container status \"ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366\": rpc error: code = NotFound desc = could not find container \"ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366\": container with ID starting with ae6e0be179524db9701bdbd8069c5ee1904eed0e612026dfab2e77da37b11366 not found: ID does not exist" Apr 16 20:10:52.735000 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.734969 2572 scope.go:117] "RemoveContainer" containerID="24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734" Apr 16 20:10:52.735230 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:10:52.735213 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734\": container with ID starting with 24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734 not found: ID does not exist" containerID="24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734" Apr 16 20:10:52.735296 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:52.735237 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734"} err="failed to get container status \"24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734\": rpc error: code = NotFound desc = could not find container \"24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734\": container with ID starting with 24899f350b6b2769b1dabb144a07536150974a1f91363c476900e5e283ee2734 not found: ID does not exist" Apr 16 20:10:53.367591 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:53.367563 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea241c85-3097-48f6-87de-c739959c64db" path="/var/lib/kubelet/pods/ea241c85-3097-48f6-87de-c739959c64db/volumes" Apr 16 20:10:55.537910 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:10:55.537865 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 20:11:05.538084 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:11:05.538042 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 20:11:15.537813 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:11:15.537760 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 20:11:25.537668 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:11:25.537622 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 20:11:35.538056 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:11:35.538008 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 20:11:45.538372 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:11:45.538274 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection refused" Apr 16 20:11:55.537857 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:11:55.537808 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" probeResult="failure" output="Get \"https://10.132.0.28:8000/health\": dial tcp 10.132.0.28:8000: connect: connection 
refused" Apr 16 20:12:05.546961 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:05.546932 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:12:05.554598 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:05.554574 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:12:11.668555 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:11.668523 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws"] Apr 16 20:12:11.668941 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:11.668848 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" containerID="cri-o://708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863" gracePeriod=30 Apr 16 20:12:17.513975 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.513938 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9"] Apr 16 20:12:17.514435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.514292 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="storage-initializer" Apr 16 20:12:17.514435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.514306 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="storage-initializer" Apr 16 20:12:17.514435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.514313 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" Apr 16 20:12:17.514435 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.514319 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" Apr 16 20:12:17.514435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.514369 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea241c85-3097-48f6-87de-c739959c64db" containerName="main" Apr 16 20:12:17.517135 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.517113 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.520382 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.520362 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 20:12:17.528653 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.528632 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9"] Apr 16 20:12:17.612759 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.612724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae3ec14-d235-43c1-acdf-65b5e4368321-tls-certs\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.612927 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.612773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn9dz\" (UniqueName: \"kubernetes.io/projected/4ae3ec14-d235-43c1-acdf-65b5e4368321-kube-api-access-sn9dz\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 
20:12:17.612927 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.612803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-dshm\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.612927 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.612827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-home\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.612927 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.612846 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-model-cache\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.612927 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.612907 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-kserve-provision-location\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.713908 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.713883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae3ec14-d235-43c1-acdf-65b5e4368321-tls-certs\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.714053 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.713931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn9dz\" (UniqueName: \"kubernetes.io/projected/4ae3ec14-d235-43c1-acdf-65b5e4368321-kube-api-access-sn9dz\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.714053 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.713966 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-dshm\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.714053 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.714001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-home\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.714053 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.714020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-model-cache\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.714053 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.714046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-kserve-provision-location\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.714466 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.714433 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-home\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.714586 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.714544 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-model-cache\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.714586 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.714569 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-kserve-provision-location\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.716200 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.716175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-dshm\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.716458 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.716439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae3ec14-d235-43c1-acdf-65b5e4368321-tls-certs\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.722436 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.722418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn9dz\" (UniqueName: \"kubernetes.io/projected/4ae3ec14-d235-43c1-acdf-65b5e4368321-kube-api-access-sn9dz\") pod \"router-with-refs-test-kserve-857cc6b66-kq5l9\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:17.827313 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:17.827279 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:18.155730 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:18.155706 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9"] Apr 16 20:12:18.158389 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:12:18.158349 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae3ec14_d235_43c1_acdf_65b5e4368321.slice/crio-cbe38fa380473d9d086fcd52ff544a9e0e7e903d759675c3aa6033b87cefda32 WatchSource:0}: Error finding container cbe38fa380473d9d086fcd52ff544a9e0e7e903d759675c3aa6033b87cefda32: Status 404 returned error can't find the container with id cbe38fa380473d9d086fcd52ff544a9e0e7e903d759675c3aa6033b87cefda32 Apr 16 20:12:18.160191 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:18.160169 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:12:18.992921 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:18.992881 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" event={"ID":"4ae3ec14-d235-43c1-acdf-65b5e4368321","Type":"ContainerStarted","Data":"4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85"} Apr 16 20:12:18.992921 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:18.992920 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" event={"ID":"4ae3ec14-d235-43c1-acdf-65b5e4368321","Type":"ContainerStarted","Data":"cbe38fa380473d9d086fcd52ff544a9e0e7e903d759675c3aa6033b87cefda32"} Apr 16 20:12:23.007138 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:23.007087 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ae3ec14-d235-43c1-acdf-65b5e4368321" 
containerID="4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85" exitCode=0 Apr 16 20:12:23.007513 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:23.007159 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" event={"ID":"4ae3ec14-d235-43c1-acdf-65b5e4368321","Type":"ContainerDied","Data":"4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85"} Apr 16 20:12:24.012141 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:24.012106 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" event={"ID":"4ae3ec14-d235-43c1-acdf-65b5e4368321","Type":"ContainerStarted","Data":"e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6"} Apr 16 20:12:24.037353 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:24.037306 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podStartSLOduration=7.037293047 podStartE2EDuration="7.037293047s" podCreationTimestamp="2026-04-16 20:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:12:24.034828104 +0000 UTC m=+1111.213590129" watchObservedRunningTime="2026-04-16 20:12:24.037293047 +0000 UTC m=+1111.216055085" Apr 16 20:12:27.827413 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:27.827374 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:27.827413 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:27.827417 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:12:27.828930 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:27.828897 2572 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 20:12:37.828193 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:37.828145 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 20:12:41.530855 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.530820 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8"] Apr 16 20:12:41.534560 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.534538 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.537530 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.537505 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-2n5s5\"" Apr 16 20:12:41.537530 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.537519 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 20:12:41.545225 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.545205 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8"] Apr 16 20:12:41.613377 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.613342 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94bbj\" (UniqueName: \"kubernetes.io/projected/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kube-api-access-94bbj\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.613531 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.613381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.613531 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.613511 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.613603 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.613572 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.613640 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.613608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.613687 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.613669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.714732 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.714693 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.714920 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.714751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.714920 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.714799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.714920 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.714855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94bbj\" (UniqueName: \"kubernetes.io/projected/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kube-api-access-94bbj\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.714920 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.714886 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.715151 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.714955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.715205 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.715135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.715205 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.715166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.715346 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.715308 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.715397 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.715340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.717567 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.717536 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.724255 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.724228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94bbj\" (UniqueName: \"kubernetes.io/projected/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kube-api-access-94bbj\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.845372 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.845322 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:12:41.932611 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.932577 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-5b8cffcb46-jlzws_d7685da0-508e-483c-8474-145e3f203c64/main/0.log" Apr 16 20:12:41.933059 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.933034 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:12:41.995819 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:41.995794 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8"] Apr 16 20:12:41.997511 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:12:41.997480 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c75db93_cb3e_4fa5_890a_d1032cdaada6.slice/crio-637cf16c4f51189f843c782b6b0f582b2ec8f7889326e68f03b960abbdc3f5ff WatchSource:0}: Error finding container 637cf16c4f51189f843c782b6b0f582b2ec8f7889326e68f03b960abbdc3f5ff: Status 404 returned error can't find the container with id 637cf16c4f51189f843c782b6b0f582b2ec8f7889326e68f03b960abbdc3f5ff Apr 16 20:12:42.018121 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.018078 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7685da0-508e-483c-8474-145e3f203c64-tls-certs\") pod \"d7685da0-508e-483c-8474-145e3f203c64\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " Apr 16 20:12:42.018218 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.018153 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-kserve-provision-location\") pod \"d7685da0-508e-483c-8474-145e3f203c64\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " Apr 16 20:12:42.018218 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.018188 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th6m6\" (UniqueName: \"kubernetes.io/projected/d7685da0-508e-483c-8474-145e3f203c64-kube-api-access-th6m6\") pod \"d7685da0-508e-483c-8474-145e3f203c64\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " Apr 16 20:12:42.018218 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.018214 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-model-cache\") pod \"d7685da0-508e-483c-8474-145e3f203c64\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " Apr 16 20:12:42.018478 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.018457 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-model-cache" (OuterVolumeSpecName: "model-cache") pod "d7685da0-508e-483c-8474-145e3f203c64" (UID: "d7685da0-508e-483c-8474-145e3f203c64"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:12:42.018953 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.018485 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-dshm\") pod \"d7685da0-508e-483c-8474-145e3f203c64\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " Apr 16 20:12:42.018953 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.018708 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-home\") pod \"d7685da0-508e-483c-8474-145e3f203c64\" (UID: \"d7685da0-508e-483c-8474-145e3f203c64\") " Apr 16 20:12:42.019189 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.019158 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-home" (OuterVolumeSpecName: "home") pod "d7685da0-508e-483c-8474-145e3f203c64" (UID: "d7685da0-508e-483c-8474-145e3f203c64"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:12:42.019259 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.019172 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:12:42.020598 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.020560 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7685da0-508e-483c-8474-145e3f203c64-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d7685da0-508e-483c-8474-145e3f203c64" (UID: "d7685da0-508e-483c-8474-145e3f203c64"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:12:42.020939 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.020915 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7685da0-508e-483c-8474-145e3f203c64-kube-api-access-th6m6" (OuterVolumeSpecName: "kube-api-access-th6m6") pod "d7685da0-508e-483c-8474-145e3f203c64" (UID: "d7685da0-508e-483c-8474-145e3f203c64"). InnerVolumeSpecName "kube-api-access-th6m6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:12:42.021030 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.021017 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-dshm" (OuterVolumeSpecName: "dshm") pod "d7685da0-508e-483c-8474-145e3f203c64" (UID: "d7685da0-508e-483c-8474-145e3f203c64"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:12:42.075494 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.075455 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d7685da0-508e-483c-8474-145e3f203c64" (UID: "d7685da0-508e-483c-8474-145e3f203c64"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:12:42.076675 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.076642 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" event={"ID":"2c75db93-cb3e-4fa5-890a-d1032cdaada6","Type":"ContainerStarted","Data":"9bd7831b3f53ad51b8dccbe011631775490437088b5e1d47e11462ec6a6d132d"} Apr 16 20:12:42.076797 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.076688 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" event={"ID":"2c75db93-cb3e-4fa5-890a-d1032cdaada6","Type":"ContainerStarted","Data":"637cf16c4f51189f843c782b6b0f582b2ec8f7889326e68f03b960abbdc3f5ff"} Apr 16 20:12:42.078084 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.078057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-5b8cffcb46-jlzws_d7685da0-508e-483c-8474-145e3f203c64/main/0.log" Apr 16 20:12:42.078430 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.078407 2572 generic.go:358] "Generic (PLEG): container finished" podID="d7685da0-508e-483c-8474-145e3f203c64" containerID="708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863" exitCode=137 Apr 16 20:12:42.078532 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.078468 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" Apr 16 20:12:42.078532 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.078483 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" event={"ID":"d7685da0-508e-483c-8474-145e3f203c64","Type":"ContainerDied","Data":"708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863"} Apr 16 20:12:42.078532 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.078512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws" event={"ID":"d7685da0-508e-483c-8474-145e3f203c64","Type":"ContainerDied","Data":"2175cb1fa17b3db2927541441e15a4421222c5b7421698b98df686325dc87033"} Apr 16 20:12:42.078532 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.078531 2572 scope.go:117] "RemoveContainer" containerID="708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863" Apr 16 20:12:42.098800 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.098710 2572 scope.go:117] "RemoveContainer" containerID="c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f" Apr 16 20:12:42.117901 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.117870 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws"] Apr 16 20:12:42.120068 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.120036 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:12:42.120068 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.120065 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath 
\"\"" Apr 16 20:12:42.120508 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.120077 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7685da0-508e-483c-8474-145e3f203c64-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:12:42.120508 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.120086 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7685da0-508e-483c-8474-145e3f203c64-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:12:42.120508 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.120126 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-th6m6\" (UniqueName: \"kubernetes.io/projected/d7685da0-508e-483c-8474-145e3f203c64-kube-api-access-th6m6\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:12:42.125128 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.125081 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-5b8cffcb46-jlzws"] Apr 16 20:12:42.164999 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.164971 2572 scope.go:117] "RemoveContainer" containerID="708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863" Apr 16 20:12:42.165374 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:12:42.165345 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863\": container with ID starting with 708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863 not found: ID does not exist" containerID="708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863" Apr 16 20:12:42.165499 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.165386 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863"} err="failed to get container status \"708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863\": rpc error: code = NotFound desc = could not find container \"708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863\": container with ID starting with 708c6bcfc0d8024b2603bf3677ccee23b86ec9ac92974e3e6dcabbd0df83b863 not found: ID does not exist" Apr 16 20:12:42.165499 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.165415 2572 scope.go:117] "RemoveContainer" containerID="c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f" Apr 16 20:12:42.165729 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:12:42.165708 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f\": container with ID starting with c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f not found: ID does not exist" containerID="c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f" Apr 16 20:12:42.165822 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:42.165738 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f"} err="failed to get container status \"c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f\": rpc error: code = NotFound desc = could not find container \"c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f\": container with ID starting with c79359b64e13333aa2d4fd72dc28761ab870ee3f0cb0b3554a1de507b1294a1f not found: ID does not exist" Apr 16 20:12:43.084767 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:43.084728 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" 
containerID="9bd7831b3f53ad51b8dccbe011631775490437088b5e1d47e11462ec6a6d132d" exitCode=0 Apr 16 20:12:43.084767 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:43.084770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" event={"ID":"2c75db93-cb3e-4fa5-890a-d1032cdaada6","Type":"ContainerDied","Data":"9bd7831b3f53ad51b8dccbe011631775490437088b5e1d47e11462ec6a6d132d"} Apr 16 20:12:43.367802 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:43.367722 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7685da0-508e-483c-8474-145e3f203c64" path="/var/lib/kubelet/pods/d7685da0-508e-483c-8474-145e3f203c64/volumes" Apr 16 20:12:45.095351 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:45.095304 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" event={"ID":"2c75db93-cb3e-4fa5-890a-d1032cdaada6","Type":"ContainerStarted","Data":"7337e024308b5aba19d0b110dcc2b3f944073fa3be7df5f2d8d61de8721adebf"} Apr 16 20:12:47.828330 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:47.828281 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 20:12:57.828164 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:12:57.828112 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 20:13:07.828527 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:07.828479 2572 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 20:13:14.708933 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:14.708853 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8"] Apr 16 20:13:15.218822 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:15.218719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" event={"ID":"2c75db93-cb3e-4fa5-890a-d1032cdaada6","Type":"ContainerStarted","Data":"1fac42792fe134411ca146457120d4c13ff9129467ebb3ce839080736280b57f"} Apr 16 20:13:15.218985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:15.218843 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="main" containerID="cri-o://7337e024308b5aba19d0b110dcc2b3f944073fa3be7df5f2d8d61de8721adebf" gracePeriod=30 Apr 16 20:13:15.218985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:15.218972 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="tokenizer" containerID="cri-o://1fac42792fe134411ca146457120d4c13ff9129467ebb3ce839080736280b57f" gracePeriod=30 Apr 16 20:13:15.219205 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:15.219187 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" 
Apr 16 20:13:15.222537 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:15.222456 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 20:13:15.243655 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:15.243597 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" podStartSLOduration=2.365332097 podStartE2EDuration="34.243578534s" podCreationTimestamp="2026-04-16 20:12:41 +0000 UTC" firstStartedPulling="2026-04-16 20:12:43.086028803 +0000 UTC m=+1130.264790810" lastFinishedPulling="2026-04-16 20:13:14.964275238 +0000 UTC m=+1162.143037247" observedRunningTime="2026-04-16 20:13:15.241323207 +0000 UTC m=+1162.420085236" watchObservedRunningTime="2026-04-16 20:13:15.243578534 +0000 UTC m=+1162.422340559" Apr 16 20:13:16.224767 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:16.224712 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerID="7337e024308b5aba19d0b110dcc2b3f944073fa3be7df5f2d8d61de8721adebf" exitCode=0 Apr 16 20:13:16.225223 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:16.224887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" event={"ID":"2c75db93-cb3e-4fa5-890a-d1032cdaada6","Type":"ContainerDied","Data":"7337e024308b5aba19d0b110dcc2b3f944073fa3be7df5f2d8d61de8721adebf"} Apr 16 20:13:17.828216 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:17.828144 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 20:13:21.846394 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:21.846350 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" Apr 16 20:13:25.219972 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:13:25.219944 2572 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused" Apr 16 20:13:26.220255 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:26.220213 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded" Apr 16 20:13:27.827873 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:27.827838 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 20:13:30.832651 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:30.832609 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"] Apr 16 20:13:30.832997 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:30.832933 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main" Apr 
16 20:13:30.832997 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:30.832944 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main"
Apr 16 20:13:30.832997 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:30.832965 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="storage-initializer"
Apr 16 20:13:30.832997 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:30.832971 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="storage-initializer"
Apr 16 20:13:30.833152 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:30.833020 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7685da0-508e-483c-8474-145e3f203c64" containerName="main"
Apr 16 20:13:31.392184 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.392144 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"]
Apr 16 20:13:31.392369 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.392303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.395780 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.395756 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 16 20:13:31.568029 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.567997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-dshm\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.568207 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.568041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-model-cache\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.568207 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.568079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9n5l\" (UniqueName: \"kubernetes.io/projected/3c56a126-64ef-4f72-9bed-a8f5211ec939-kube-api-access-x9n5l\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.568207 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.568151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.568207 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.568181 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c56a126-64ef-4f72-9bed-a8f5211ec939-tls-certs\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.568342 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.568268 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-home\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.669558 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.669486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-dshm\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.669558 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.669526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-model-cache\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.669558 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.669548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9n5l\" (UniqueName: \"kubernetes.io/projected/3c56a126-64ef-4f72-9bed-a8f5211ec939-kube-api-access-x9n5l\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.669814 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.669588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.669814 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.669618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c56a126-64ef-4f72-9bed-a8f5211ec939-tls-certs\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.669814 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.669671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-home\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.670281 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.670010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-model-cache\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.670281 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.670051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.670281 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.670066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-home\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.672519 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.672488 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-dshm\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.672634 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.672563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c56a126-64ef-4f72-9bed-a8f5211ec939-tls-certs\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.681444 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.681417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9n5l\" (UniqueName: \"kubernetes.io/projected/3c56a126-64ef-4f72-9bed-a8f5211ec939-kube-api-access-x9n5l\") pod \"precise-prefix-cache-test-kserve-f866f4fdc-5qtrk\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.703036 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.703005 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:31.838942 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:31.838917 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"]
Apr 16 20:13:31.841354 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:13:31.841322 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c56a126_64ef_4f72_9bed_a8f5211ec939.slice/crio-c5ddff472d2577451ad4d9bf49a3b43ec3e7313570173eaa7502acc40bed7e7d WatchSource:0}: Error finding container c5ddff472d2577451ad4d9bf49a3b43ec3e7313570173eaa7502acc40bed7e7d: Status 404 returned error can't find the container with id c5ddff472d2577451ad4d9bf49a3b43ec3e7313570173eaa7502acc40bed7e7d
Apr 16 20:13:32.287879 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:32.287797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk" event={"ID":"3c56a126-64ef-4f72-9bed-a8f5211ec939","Type":"ContainerStarted","Data":"a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34"}
Apr 16 20:13:32.287879 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:32.287840 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk" event={"ID":"3c56a126-64ef-4f72-9bed-a8f5211ec939","Type":"ContainerStarted","Data":"c5ddff472d2577451ad4d9bf49a3b43ec3e7313570173eaa7502acc40bed7e7d"}
Apr 16 20:13:35.220311 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:13:35.220282 2572 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused"
Apr 16 20:13:36.220446 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:36.220405 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded"
Apr 16 20:13:37.307969 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:37.307934 2572 generic.go:358] "Generic (PLEG): container finished" podID="3c56a126-64ef-4f72-9bed-a8f5211ec939" containerID="a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34" exitCode=0
Apr 16 20:13:37.308376 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:37.307989 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk" event={"ID":"3c56a126-64ef-4f72-9bed-a8f5211ec939","Type":"ContainerDied","Data":"a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34"}
Apr 16 20:13:37.827973 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:37.827931 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 20:13:38.312764 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:38.312725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk" event={"ID":"3c56a126-64ef-4f72-9bed-a8f5211ec939","Type":"ContainerStarted","Data":"b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78"}
Apr 16 20:13:38.335466 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:38.335421 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk" podStartSLOduration=8.335406933 podStartE2EDuration="8.335406933s" podCreationTimestamp="2026-04-16 20:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:13:38.334312433 +0000 UTC m=+1185.513074457" watchObservedRunningTime="2026-04-16 20:13:38.335406933 +0000 UTC m=+1185.514168957"
Apr 16 20:13:41.703406 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:41.703371 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:41.703953 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:41.703535 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:41.716070 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:41.716046 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:42.337425 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:42.337391 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:13:45.220317 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:13:45.220289 2572 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused"
Apr 16 20:13:45.337618 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.337590 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8_2c75db93-cb3e-4fa5-890a-d1032cdaada6/tokenizer/0.log"
Apr 16 20:13:45.338274 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.338243 2572 generic.go:358] "Generic (PLEG): container finished" podID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerID="1fac42792fe134411ca146457120d4c13ff9129467ebb3ce839080736280b57f" exitCode=137
Apr 16 20:13:45.338401 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.338319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" event={"ID":"2c75db93-cb3e-4fa5-890a-d1032cdaada6","Type":"ContainerDied","Data":"1fac42792fe134411ca146457120d4c13ff9129467ebb3ce839080736280b57f"}
Apr 16 20:13:45.889301 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.889279 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8_2c75db93-cb3e-4fa5-890a-d1032cdaada6/tokenizer/0.log"
Apr 16 20:13:45.889994 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.889975 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8"
Apr 16 20:13:45.986188 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986159 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94bbj\" (UniqueName: \"kubernetes.io/projected/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kube-api-access-94bbj\") pod \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") "
Apr 16 20:13:45.986350 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986208 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kserve-provision-location\") pod \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") "
Apr 16 20:13:45.986350 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986232 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-cache\") pod \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") "
Apr 16 20:13:45.986350 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986312 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tls-certs\") pod \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") "
Apr 16 20:13:45.986350 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986344 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-uds\") pod \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") "
Apr 16 20:13:45.986552 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986381 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-tmp\") pod \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\" (UID: \"2c75db93-cb3e-4fa5-890a-d1032cdaada6\") "
Apr 16 20:13:45.986552 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986522 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2c75db93-cb3e-4fa5-890a-d1032cdaada6" (UID: "2c75db93-cb3e-4fa5-890a-d1032cdaada6"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:13:45.986653 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986639 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:13:45.986711 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986657 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2c75db93-cb3e-4fa5-890a-d1032cdaada6" (UID: "2c75db93-cb3e-4fa5-890a-d1032cdaada6"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:13:45.986824 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.986802 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2c75db93-cb3e-4fa5-890a-d1032cdaada6" (UID: "2c75db93-cb3e-4fa5-890a-d1032cdaada6"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:13:45.987070 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.987041 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2c75db93-cb3e-4fa5-890a-d1032cdaada6" (UID: "2c75db93-cb3e-4fa5-890a-d1032cdaada6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:13:45.988373 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.988352 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kube-api-access-94bbj" (OuterVolumeSpecName: "kube-api-access-94bbj") pod "2c75db93-cb3e-4fa5-890a-d1032cdaada6" (UID: "2c75db93-cb3e-4fa5-890a-d1032cdaada6"). InnerVolumeSpecName "kube-api-access-94bbj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:13:45.988456 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:45.988410 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2c75db93-cb3e-4fa5-890a-d1032cdaada6" (UID: "2c75db93-cb3e-4fa5-890a-d1032cdaada6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:13:46.087942 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.087905 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:13:46.087942 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.087936 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-uds\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:13:46.087942 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.087945 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-tokenizer-tmp\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:13:46.088180 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.087955 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94bbj\" (UniqueName: \"kubernetes.io/projected/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kube-api-access-94bbj\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:13:46.088180 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.087964 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c75db93-cb3e-4fa5-890a-d1032cdaada6-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:13:46.220460 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.220407 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded"
Apr 16 20:13:46.343152 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.343074 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8_2c75db93-cb3e-4fa5-890a-d1032cdaada6/tokenizer/0.log"
Apr 16 20:13:46.343852 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.343824 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8" event={"ID":"2c75db93-cb3e-4fa5-890a-d1032cdaada6","Type":"ContainerDied","Data":"637cf16c4f51189f843c782b6b0f582b2ec8f7889326e68f03b960abbdc3f5ff"}
Apr 16 20:13:46.343939 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.343873 2572 scope.go:117] "RemoveContainer" containerID="1fac42792fe134411ca146457120d4c13ff9129467ebb3ce839080736280b57f"
Apr 16 20:13:46.343939 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.343836 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8"
Apr 16 20:13:46.352623 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.352600 2572 scope.go:117] "RemoveContainer" containerID="7337e024308b5aba19d0b110dcc2b3f944073fa3be7df5f2d8d61de8721adebf"
Apr 16 20:13:46.359884 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.359865 2572 scope.go:117] "RemoveContainer" containerID="9bd7831b3f53ad51b8dccbe011631775490437088b5e1d47e11462ec6a6d132d"
Apr 16 20:13:46.366879 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.366858 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8"]
Apr 16 20:13:46.370860 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:46.370839 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-5f9c4d7w4lj8"]
Apr 16 20:13:47.366342 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:47.366307 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" path="/var/lib/kubelet/pods/2c75db93-cb3e-4fa5-890a-d1032cdaada6/volumes"
Apr 16 20:13:47.828431 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:47.828383 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 20:13:53.296414 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:53.296380 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log"
Apr 16 20:13:53.298788 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:53.298760 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log"
Apr 16 20:13:53.304023 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:53.303998 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log"
Apr 16 20:13:53.306542 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:53.306522 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log"
Apr 16 20:13:57.828396 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:13:57.828351 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 20:14:03.869904 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:03.869867 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"]
Apr 16 20:14:03.870418 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:03.870149 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk" podUID="3c56a126-64ef-4f72-9bed-a8f5211ec939" containerName="main" containerID="cri-o://b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78" gracePeriod=30
Apr 16 20:14:04.152374 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.152352 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:14:04.239325 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.239300 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-home\") pod \"3c56a126-64ef-4f72-9bed-a8f5211ec939\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") "
Apr 16 20:14:04.239486 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.239335 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-dshm\") pod \"3c56a126-64ef-4f72-9bed-a8f5211ec939\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") "
Apr 16 20:14:04.239486 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.239406 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9n5l\" (UniqueName: \"kubernetes.io/projected/3c56a126-64ef-4f72-9bed-a8f5211ec939-kube-api-access-x9n5l\") pod \"3c56a126-64ef-4f72-9bed-a8f5211ec939\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") "
Apr 16 20:14:04.239486 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.239423 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-kserve-provision-location\") pod \"3c56a126-64ef-4f72-9bed-a8f5211ec939\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") "
Apr 16 20:14:04.239486 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.239450 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c56a126-64ef-4f72-9bed-a8f5211ec939-tls-certs\") pod \"3c56a126-64ef-4f72-9bed-a8f5211ec939\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") "
Apr 16 20:14:04.239486 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.239478 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-model-cache\") pod \"3c56a126-64ef-4f72-9bed-a8f5211ec939\" (UID: \"3c56a126-64ef-4f72-9bed-a8f5211ec939\") "
Apr 16 20:14:04.239720 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.239600 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-home" (OuterVolumeSpecName: "home") pod "3c56a126-64ef-4f72-9bed-a8f5211ec939" (UID: "3c56a126-64ef-4f72-9bed-a8f5211ec939"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:04.239767 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.239744 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-model-cache" (OuterVolumeSpecName: "model-cache") pod "3c56a126-64ef-4f72-9bed-a8f5211ec939" (UID: "3c56a126-64ef-4f72-9bed-a8f5211ec939"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:04.241607 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.241580 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-dshm" (OuterVolumeSpecName: "dshm") pod "3c56a126-64ef-4f72-9bed-a8f5211ec939" (UID: "3c56a126-64ef-4f72-9bed-a8f5211ec939"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:04.241607 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.241593 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c56a126-64ef-4f72-9bed-a8f5211ec939-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3c56a126-64ef-4f72-9bed-a8f5211ec939" (UID: "3c56a126-64ef-4f72-9bed-a8f5211ec939"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:04.241744 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.241589 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c56a126-64ef-4f72-9bed-a8f5211ec939-kube-api-access-x9n5l" (OuterVolumeSpecName: "kube-api-access-x9n5l") pod "3c56a126-64ef-4f72-9bed-a8f5211ec939" (UID: "3c56a126-64ef-4f72-9bed-a8f5211ec939"). InnerVolumeSpecName "kube-api-access-x9n5l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:14:04.340441 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.340407 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.340441 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.340437 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.340706 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.340450 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.340706 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.340461 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x9n5l\" (UniqueName: \"kubernetes.io/projected/3c56a126-64ef-4f72-9bed-a8f5211ec939-kube-api-access-x9n5l\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.340706 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.340473 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c56a126-64ef-4f72-9bed-a8f5211ec939-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:04.407624 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.407554 2572 generic.go:358] "Generic (PLEG): container finished" podID="3c56a126-64ef-4f72-9bed-a8f5211ec939" containerID="b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78" exitCode=0
Apr 16 20:14:04.407624 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.407606 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk" event={"ID":"3c56a126-64ef-4f72-9bed-a8f5211ec939","Type":"ContainerDied","Data":"b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78"}
Apr 16 20:14:04.407826 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.407624 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"
Apr 16 20:14:04.407826 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.407651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk" event={"ID":"3c56a126-64ef-4f72-9bed-a8f5211ec939","Type":"ContainerDied","Data":"c5ddff472d2577451ad4d9bf49a3b43ec3e7313570173eaa7502acc40bed7e7d"}
Apr 16 20:14:04.407826 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.407668 2572 scope.go:117] "RemoveContainer" containerID="b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78"
Apr 16 20:14:04.415647 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.415626 2572 scope.go:117] "RemoveContainer" containerID="a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34"
Apr 16 20:14:04.477022 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.476988 2572 scope.go:117] "RemoveContainer" containerID="b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78"
Apr 16 20:14:04.477352 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:14:04.477332 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78\": container with ID starting with b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78 not found: ID does not exist" containerID="b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78"
Apr 16 20:14:04.477441 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.477366 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78"} err="failed to get container status \"b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78\": rpc error: code = NotFound desc = could not find container \"b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78\": container with ID starting with b98de5d8739ceffe77a15ae1bc896535c52726b5b3af5efb9590424b21316f78 not found: ID does not exist"
Apr 16 20:14:04.477441 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.477393 2572 scope.go:117] "RemoveContainer" containerID="a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34"
Apr 16 20:14:04.477671 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:14:04.477655 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34\": container with ID starting with a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34 not found: ID does not exist" containerID="a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34"
Apr 16 20:14:04.477721 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.477680 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34"} err="failed to 
get container status \"a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34\": rpc error: code = NotFound desc = could not find container \"a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34\": container with ID starting with a94ddd19cf0666bcea292a4f3307f194fb261bf07c5460738c6f561cf9fbfd34 not found: ID does not exist" Apr 16 20:14:04.637518 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.637470 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3c56a126-64ef-4f72-9bed-a8f5211ec939" (UID: "3c56a126-64ef-4f72-9bed-a8f5211ec939"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:04.642790 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.642770 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c56a126-64ef-4f72-9bed-a8f5211ec939-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:14:04.731623 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.731595 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"] Apr 16 20:14:04.735234 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:04.735210 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-f866f4fdc-5qtrk"] Apr 16 20:14:05.365906 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:05.365868 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c56a126-64ef-4f72-9bed-a8f5211ec939" path="/var/lib/kubelet/pods/3c56a126-64ef-4f72-9bed-a8f5211ec939/volumes" Apr 16 20:14:07.794332 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794306 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf"] Apr 16 20:14:07.794669 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794654 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="main" Apr 16 20:14:07.794735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794670 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="main" Apr 16 20:14:07.794735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794682 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c56a126-64ef-4f72-9bed-a8f5211ec939" containerName="main" Apr 16 20:14:07.794735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794687 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c56a126-64ef-4f72-9bed-a8f5211ec939" containerName="main" Apr 16 20:14:07.794735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794702 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c56a126-64ef-4f72-9bed-a8f5211ec939" containerName="storage-initializer" Apr 16 20:14:07.794735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794708 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c56a126-64ef-4f72-9bed-a8f5211ec939" containerName="storage-initializer" Apr 16 20:14:07.794735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794717 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="storage-initializer" Apr 16 20:14:07.794735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794722 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="storage-initializer" Apr 16 20:14:07.794735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794733 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="tokenizer" Apr 16 20:14:07.795076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794742 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="tokenizer" Apr 16 20:14:07.795076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794825 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c56a126-64ef-4f72-9bed-a8f5211ec939" containerName="main" Apr 16 20:14:07.795076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794836 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="main" Apr 16 20:14:07.795076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.794847 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c75db93-cb3e-4fa5-890a-d1032cdaada6" containerName="tokenizer" Apr 16 20:14:07.798200 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.798178 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.801143 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.801125 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-criticality-kserve-self-signed-certs\"" Apr 16 20:14:07.808814 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.808792 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf"] Apr 16 20:14:07.838698 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.838673 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:14:07.846376 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.846354 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:14:07.865329 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.865309 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-dshm\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.865432 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.865338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-home\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.865432 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.865357 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7712cba0-bd22-4ce1-99a7-7775f0f0e342-tls-certs\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.865432 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.865416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-model-cache\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.865535 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.865472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4gxm\" (UniqueName: \"kubernetes.io/projected/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kube-api-access-g4gxm\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.865535 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.865505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kserve-provision-location\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.966503 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.966468 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4gxm\" (UniqueName: 
\"kubernetes.io/projected/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kube-api-access-g4gxm\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.966660 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.966523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kserve-provision-location\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.966660 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.966584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-dshm\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.966660 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.966614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-home\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.966824 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.966728 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7712cba0-bd22-4ce1-99a7-7775f0f0e342-tls-certs\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " 
pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.966882 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.966823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-model-cache\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.966978 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.966956 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-home\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.967050 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.967029 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kserve-provision-location\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.967167 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.967146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-model-cache\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.968796 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.968773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-dshm\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.969360 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.969339 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7712cba0-bd22-4ce1-99a7-7775f0f0e342-tls-certs\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:07.974920 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:07.974901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4gxm\" (UniqueName: \"kubernetes.io/projected/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kube-api-access-g4gxm\") pod \"conv-test-criticality-kserve-566d56d49c-s92hf\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") " pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:08.109267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:08.109185 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:08.238577 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:08.238465 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf"] Apr 16 20:14:08.242230 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:14:08.242190 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7712cba0_bd22_4ce1_99a7_7775f0f0e342.slice/crio-8506075380cb37b8a13357f64af630f8df3431d50df1b20a11ae4a06152419df WatchSource:0}: Error finding container 8506075380cb37b8a13357f64af630f8df3431d50df1b20a11ae4a06152419df: Status 404 returned error can't find the container with id 8506075380cb37b8a13357f64af630f8df3431d50df1b20a11ae4a06152419df Apr 16 20:14:08.421987 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:08.421906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" event={"ID":"7712cba0-bd22-4ce1-99a7-7775f0f0e342","Type":"ContainerStarted","Data":"9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02"} Apr 16 20:14:08.421987 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:08.421942 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" event={"ID":"7712cba0-bd22-4ce1-99a7-7775f0f0e342","Type":"ContainerStarted","Data":"8506075380cb37b8a13357f64af630f8df3431d50df1b20a11ae4a06152419df"} Apr 16 20:14:12.436292 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:12.436261 2572 generic.go:358] "Generic (PLEG): container finished" podID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" containerID="9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02" exitCode=0 Apr 16 20:14:12.436292 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:12.436296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" event={"ID":"7712cba0-bd22-4ce1-99a7-7775f0f0e342","Type":"ContainerDied","Data":"9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02"} Apr 16 20:14:13.441337 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.441305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" event={"ID":"7712cba0-bd22-4ce1-99a7-7775f0f0e342","Type":"ContainerStarted","Data":"1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503"} Apr 16 20:14:13.464500 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.464456 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" podStartSLOduration=6.4644423060000005 podStartE2EDuration="6.464442306s" podCreationTimestamp="2026-04-16 20:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:13.462683291 +0000 UTC m=+1220.641445328" watchObservedRunningTime="2026-04-16 20:14:13.464442306 +0000 UTC m=+1220.643204331" Apr 16 20:14:13.817153 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.817118 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw"] Apr 16 20:14:13.819385 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.819363 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:13.822314 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.822293 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 20:14:13.831283 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.831257 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw"] Apr 16 20:14:13.920188 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.920149 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-home\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:13.920188 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.920186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:13.920381 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.920248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-model-cache\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:13.920381 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.920289 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kserve-provision-location\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:13.920381 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.920317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdfl5\" (UniqueName: \"kubernetes.io/projected/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kube-api-access-sdfl5\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:13.920505 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:13.920394 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-dshm\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.021600 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.021564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-model-cache\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.021600 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.021601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kserve-provision-location\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.021798 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.021628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdfl5\" (UniqueName: \"kubernetes.io/projected/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kube-api-access-sdfl5\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.021798 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.021691 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-dshm\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.021798 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.021745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-home\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.021798 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.021769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 
20:14:14.022030 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.022005 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-model-cache\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.022117 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.022055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kserve-provision-location\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.022117 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.022085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-home\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.023999 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.023977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-dshm\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.024445 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.024424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" 
(UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.033656 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.033629 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdfl5\" (UniqueName: \"kubernetes.io/projected/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kube-api-access-sdfl5\") pod \"stop-feature-test-kserve-55c8bffd6-2lnnw\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.130263 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.130180 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:14:14.273918 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.273804 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw"] Apr 16 20:14:14.276696 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:14:14.276655 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05be8fbd_8c8e_4bef_aa87_0ccb67a6875f.slice/crio-e982cff5ce02998fbe78bd1e93b30c34822b1ea78b430850d410a27e7ca827e5 WatchSource:0}: Error finding container e982cff5ce02998fbe78bd1e93b30c34822b1ea78b430850d410a27e7ca827e5: Status 404 returned error can't find the container with id e982cff5ce02998fbe78bd1e93b30c34822b1ea78b430850d410a27e7ca827e5 Apr 16 20:14:14.446582 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.446495 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" event={"ID":"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f","Type":"ContainerStarted","Data":"0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88"} Apr 16 20:14:14.446582 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:14.446544 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" event={"ID":"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f","Type":"ContainerStarted","Data":"e982cff5ce02998fbe78bd1e93b30c34822b1ea78b430850d410a27e7ca827e5"}
Apr 16 20:14:16.596166 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:16.596127 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf"]
Apr 16 20:14:16.596604 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:16.596499 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" podUID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" containerName="main" containerID="cri-o://1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503" gracePeriod=30
Apr 16 20:14:18.109512 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:18.109462 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf"
Apr 16 20:14:18.462759 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:18.462678 2572 generic.go:358] "Generic (PLEG): container finished" podID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerID="0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88" exitCode=0
Apr 16 20:14:18.462901 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:18.462752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" event={"ID":"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f","Type":"ContainerDied","Data":"0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88"}
Apr 16 20:14:19.469018 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:19.468971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" event={"ID":"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f","Type":"ContainerStarted","Data":"da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39"}
Apr 16 20:14:19.492149 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:19.492076 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podStartSLOduration=6.492059258 podStartE2EDuration="6.492059258s" podCreationTimestamp="2026-04-16 20:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:19.48935549 +0000 UTC m=+1226.668117516" watchObservedRunningTime="2026-04-16 20:14:19.492059258 +0000 UTC m=+1226.670821283"
Apr 16 20:14:23.108963 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:23.108931 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9"]
Apr 16 20:14:23.109455 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:23.109420 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" containerID="cri-o://e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6" gracePeriod=30
Apr 16 20:14:24.130919 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:24.130877 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw"
Apr 16 20:14:24.131396 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:24.130963 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw"
Apr 16 20:14:24.132532 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:24.132498 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 20:14:30.447827 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.447783 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"]
Apr 16 20:14:30.487433 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.487398 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"]
Apr 16 20:14:30.487614 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.487554 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.490985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.490928 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-8kvzn\""
Apr 16 20:14:30.491134 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.491122 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 16 20:14:30.520123 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.520078 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"]
Apr 16 20:14:30.520123 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.520114 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"]
Apr 16 20:14:30.520290 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.520206 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.664236 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnd27\" (UniqueName: \"kubernetes.io/projected/93e8c4ec-79da-4421-be85-19b6810b3069-kube-api-access-mnd27\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.664422 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.664422 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d527aadd-baba-4971-b2e9-5df1ede5f2d0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.664422 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664335 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.664422 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664375 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.664657 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.664657 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93e8c4ec-79da-4421-be85-19b6810b3069-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.664657 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664509 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.664657 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664527 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.664657 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664544 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.664657 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.664657 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.664584 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njq55\" (UniqueName: \"kubernetes.io/projected/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kube-api-access-njq55\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765386 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.765386 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765594 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93e8c4ec-79da-4421-be85-19b6810b3069-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.765594 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765594 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765475 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.765594 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765594 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.765594 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njq55\" (UniqueName: \"kubernetes.io/projected/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kube-api-access-njq55\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765594 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnd27\" (UniqueName: \"kubernetes.io/projected/93e8c4ec-79da-4421-be85-19b6810b3069-kube-api-access-mnd27\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.765969 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765969 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d527aadd-baba-4971-b2e9-5df1ede5f2d0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765969 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.765969 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.765969 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765969 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765800 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.765969 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.765963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.766370 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.766112 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.766370 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.766208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.768648 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.768614 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.768797 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.768733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.769032 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.768993 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d527aadd-baba-4971-b2e9-5df1ede5f2d0-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.769620 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.769598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93e8c4ec-79da-4421-be85-19b6810b3069-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.778171 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.778149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njq55\" (UniqueName: \"kubernetes.io/projected/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kube-api-access-njq55\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.785591 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.785564 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnd27\" (UniqueName: \"kubernetes.io/projected/93e8c4ec-79da-4421-be85-19b6810b3069-kube-api-access-mnd27\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.798357 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.798330 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:30.830549 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.830508 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:30.976916 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.976877 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"]
Apr 16 20:14:30.979391 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:14:30.979359 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd527aadd_baba_4971_b2e9_5df1ede5f2d0.slice/crio-aee5638edb506d488fe3675ea8333ab51b5504c967def73164be68ea6eb9e4d4 WatchSource:0}: Error finding container aee5638edb506d488fe3675ea8333ab51b5504c967def73164be68ea6eb9e4d4: Status 404 returned error can't find the container with id aee5638edb506d488fe3675ea8333ab51b5504c967def73164be68ea6eb9e4d4
Apr 16 20:14:30.991877 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:30.991851 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"]
Apr 16 20:14:31.524488 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:31.524417 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" event={"ID":"93e8c4ec-79da-4421-be85-19b6810b3069","Type":"ContainerStarted","Data":"fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf"}
Apr 16 20:14:31.524488 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:31.524460 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" event={"ID":"93e8c4ec-79da-4421-be85-19b6810b3069","Type":"ContainerStarted","Data":"3aec01e45da837e52b0e0089e32b9346c1a36262f3f574d1d5b4791bc30edcad"}
Apr 16 20:14:31.527782 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:31.527744 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" event={"ID":"d527aadd-baba-4971-b2e9-5df1ede5f2d0","Type":"ContainerStarted","Data":"aee5638edb506d488fe3675ea8333ab51b5504c967def73164be68ea6eb9e4d4"}
Apr 16 20:14:33.539665 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:33.539613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" event={"ID":"d527aadd-baba-4971-b2e9-5df1ede5f2d0","Type":"ContainerStarted","Data":"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e"}
Apr 16 20:14:33.540170 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:33.539824 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:34.131760 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:34.131705 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 20:14:34.550077 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:34.550029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" event={"ID":"d527aadd-baba-4971-b2e9-5df1ede5f2d0","Type":"ContainerStarted","Data":"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f"}
Apr 16 20:14:35.555167 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:35.555133 2572 generic.go:358] "Generic (PLEG): container finished" podID="93e8c4ec-79da-4421-be85-19b6810b3069" containerID="fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf" exitCode=0
Apr 16 20:14:35.555693 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:35.555203 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" event={"ID":"93e8c4ec-79da-4421-be85-19b6810b3069","Type":"ContainerDied","Data":"fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf"}
Apr 16 20:14:36.562240 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:36.562191 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" event={"ID":"93e8c4ec-79da-4421-be85-19b6810b3069","Type":"ContainerStarted","Data":"48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8"}
Apr 16 20:14:36.584087 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:36.584019 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podStartSLOduration=6.583999082 podStartE2EDuration="6.583999082s" podCreationTimestamp="2026-04-16 20:14:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:36.582361236 +0000 UTC m=+1243.761123277" watchObservedRunningTime="2026-04-16 20:14:36.583999082 +0000 UTC m=+1243.762761108"
Apr 16 20:14:38.571850 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:38.571809 2572 generic.go:358] "Generic (PLEG): container finished" podID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerID="21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f" exitCode=0
Apr 16 20:14:38.572566 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:38.571855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" event={"ID":"d527aadd-baba-4971-b2e9-5df1ede5f2d0","Type":"ContainerDied","Data":"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f"}
Apr 16 20:14:39.578361 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:39.578314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" event={"ID":"d527aadd-baba-4971-b2e9-5df1ede5f2d0","Type":"ContainerStarted","Data":"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0"}
Apr 16 20:14:39.602039 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:39.601961 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podStartSLOduration=8.003978357 podStartE2EDuration="9.601942395s" podCreationTimestamp="2026-04-16 20:14:30 +0000 UTC" firstStartedPulling="2026-04-16 20:14:30.981245899 +0000 UTC m=+1238.160007903" lastFinishedPulling="2026-04-16 20:14:32.579209931 +0000 UTC m=+1239.757971941" observedRunningTime="2026-04-16 20:14:39.599742529 +0000 UTC m=+1246.778504591" watchObservedRunningTime="2026-04-16 20:14:39.601942395 +0000 UTC m=+1246.780704420"
Apr 16 20:14:40.799046 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:40.799004 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:40.799046 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:40.799055 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"
Apr 16 20:14:40.800897 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:40.800864 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused"
Apr 16 20:14:40.831401 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:40.831349 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:40.831401 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:40.831406 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"
Apr 16 20:14:40.833126 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:40.833067 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 20:14:44.131003 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:44.130884 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 16 20:14:46.884550 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.884520 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-criticality-kserve-566d56d49c-s92hf_7712cba0-bd22-4ce1-99a7-7775f0f0e342/main/0.log"
Apr 16 20:14:46.884951 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.884934 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf"
Apr 16 20:14:46.931180 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.931148 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4gxm\" (UniqueName: \"kubernetes.io/projected/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kube-api-access-g4gxm\") pod \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") "
Apr 16 20:14:46.931369 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.931227 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7712cba0-bd22-4ce1-99a7-7775f0f0e342-tls-certs\") pod \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") "
Apr 16 20:14:46.931369 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.931272 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-home\") pod \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") "
Apr 16 20:14:46.931369 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.931319 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-dshm\") pod \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") "
Apr 16 20:14:46.931538 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.931381 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kserve-provision-location\") pod \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") "
Apr 16 20:14:46.931538 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.931414 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-model-cache\") pod \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\" (UID: \"7712cba0-bd22-4ce1-99a7-7775f0f0e342\") "
Apr 16 20:14:46.931841 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.931798 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-home" (OuterVolumeSpecName: "home") pod "7712cba0-bd22-4ce1-99a7-7775f0f0e342" (UID: "7712cba0-bd22-4ce1-99a7-7775f0f0e342"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:46.932184 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.932148 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-model-cache" (OuterVolumeSpecName: "model-cache") pod "7712cba0-bd22-4ce1-99a7-7775f0f0e342" (UID: "7712cba0-bd22-4ce1-99a7-7775f0f0e342"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:46.933897 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.933871 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:46.933897 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.933894 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:46.934052 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.933981 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7712cba0-bd22-4ce1-99a7-7775f0f0e342-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7712cba0-bd22-4ce1-99a7-7775f0f0e342" (UID: "7712cba0-bd22-4ce1-99a7-7775f0f0e342"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:14:46.934391 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.934312 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kube-api-access-g4gxm" (OuterVolumeSpecName: "kube-api-access-g4gxm") pod "7712cba0-bd22-4ce1-99a7-7775f0f0e342" (UID: "7712cba0-bd22-4ce1-99a7-7775f0f0e342"). InnerVolumeSpecName "kube-api-access-g4gxm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:14:46.935384 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.935352 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-dshm" (OuterVolumeSpecName: "dshm") pod "7712cba0-bd22-4ce1-99a7-7775f0f0e342" (UID: "7712cba0-bd22-4ce1-99a7-7775f0f0e342"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:46.992682 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:46.992622 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7712cba0-bd22-4ce1-99a7-7775f0f0e342" (UID: "7712cba0-bd22-4ce1-99a7-7775f0f0e342"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:14:47.035192 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.035156 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7712cba0-bd22-4ce1-99a7-7775f0f0e342-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:47.035192 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.035185 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:47.035192 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.035195 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:47.035525 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.035205 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4gxm\" (UniqueName: \"kubernetes.io/projected/7712cba0-bd22-4ce1-99a7-7775f0f0e342-kube-api-access-g4gxm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\""
Apr 16 20:14:47.632656 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.632555 2572 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_conv-test-criticality-kserve-566d56d49c-s92hf_7712cba0-bd22-4ce1-99a7-7775f0f0e342/main/0.log" Apr 16 20:14:47.632972 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.632943 2572 generic.go:358] "Generic (PLEG): container finished" podID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" containerID="1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503" exitCode=137 Apr 16 20:14:47.633053 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.633029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" event={"ID":"7712cba0-bd22-4ce1-99a7-7775f0f0e342","Type":"ContainerDied","Data":"1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503"} Apr 16 20:14:47.633125 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.633065 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" event={"ID":"7712cba0-bd22-4ce1-99a7-7775f0f0e342","Type":"ContainerDied","Data":"8506075380cb37b8a13357f64af630f8df3431d50df1b20a11ae4a06152419df"} Apr 16 20:14:47.633125 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.633065 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf" Apr 16 20:14:47.633125 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.633081 2572 scope.go:117] "RemoveContainer" containerID="1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503" Apr 16 20:14:47.644935 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.644907 2572 scope.go:117] "RemoveContainer" containerID="9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02" Apr 16 20:14:47.654835 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.654798 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf"] Apr 16 20:14:47.657989 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.657963 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-criticality-kserve-566d56d49c-s92hf"] Apr 16 20:14:47.717472 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.717445 2572 scope.go:117] "RemoveContainer" containerID="1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503" Apr 16 20:14:47.717823 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:14:47.717792 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503\": container with ID starting with 1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503 not found: ID does not exist" containerID="1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503" Apr 16 20:14:47.717899 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.717854 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503"} err="failed to get container status \"1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503\": rpc error: code = NotFound desc = could not find container 
\"1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503\": container with ID starting with 1c5964ab9c2d57d0529eac66d4015f5620c74fad576a32dd76a76d5c8296d503 not found: ID does not exist" Apr 16 20:14:47.717899 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.717882 2572 scope.go:117] "RemoveContainer" containerID="9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02" Apr 16 20:14:47.718399 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:14:47.718372 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02\": container with ID starting with 9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02 not found: ID does not exist" containerID="9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02" Apr 16 20:14:47.718463 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:47.718410 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02"} err="failed to get container status \"9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02\": rpc error: code = NotFound desc = could not find container \"9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02\": container with ID starting with 9cffef42b2e95fac6b2471230fbc0888d314132985a70c383aff05636ef26c02 not found: ID does not exist" Apr 16 20:14:49.366845 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:49.366807 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" path="/var/lib/kubelet/pods/7712cba0-bd22-4ce1-99a7-7775f0f0e342/volumes" Apr 16 20:14:50.798961 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:50.798916 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" 
podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:14:50.817000 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:50.816969 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" Apr 16 20:14:50.831085 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:50.831041 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:14:53.439186 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.439157 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-857cc6b66-kq5l9_4ae3ec14-d235-43c1-acdf-65b5e4368321/main/0.log" Apr 16 20:14:53.439623 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.439604 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:14:53.496798 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.496765 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae3ec14-d235-43c1-acdf-65b5e4368321-tls-certs\") pod \"4ae3ec14-d235-43c1-acdf-65b5e4368321\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " Apr 16 20:14:53.496985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.496837 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-kserve-provision-location\") pod \"4ae3ec14-d235-43c1-acdf-65b5e4368321\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " Apr 16 20:14:53.496985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.496891 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-dshm\") pod \"4ae3ec14-d235-43c1-acdf-65b5e4368321\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " Apr 16 20:14:53.496985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.496943 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-home\") pod \"4ae3ec14-d235-43c1-acdf-65b5e4368321\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " Apr 16 20:14:53.497186 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.497007 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-model-cache\") pod \"4ae3ec14-d235-43c1-acdf-65b5e4368321\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " Apr 16 20:14:53.497186 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.497037 
2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn9dz\" (UniqueName: \"kubernetes.io/projected/4ae3ec14-d235-43c1-acdf-65b5e4368321-kube-api-access-sn9dz\") pod \"4ae3ec14-d235-43c1-acdf-65b5e4368321\" (UID: \"4ae3ec14-d235-43c1-acdf-65b5e4368321\") " Apr 16 20:14:53.501303 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.501180 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-home" (OuterVolumeSpecName: "home") pod "4ae3ec14-d235-43c1-acdf-65b5e4368321" (UID: "4ae3ec14-d235-43c1-acdf-65b5e4368321"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:53.511204 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.500881 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-model-cache" (OuterVolumeSpecName: "model-cache") pod "4ae3ec14-d235-43c1-acdf-65b5e4368321" (UID: "4ae3ec14-d235-43c1-acdf-65b5e4368321"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:53.513627 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.513329 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-dshm" (OuterVolumeSpecName: "dshm") pod "4ae3ec14-d235-43c1-acdf-65b5e4368321" (UID: "4ae3ec14-d235-43c1-acdf-65b5e4368321"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:53.516882 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.516815 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae3ec14-d235-43c1-acdf-65b5e4368321-kube-api-access-sn9dz" (OuterVolumeSpecName: "kube-api-access-sn9dz") pod "4ae3ec14-d235-43c1-acdf-65b5e4368321" (UID: "4ae3ec14-d235-43c1-acdf-65b5e4368321"). 
InnerVolumeSpecName "kube-api-access-sn9dz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:14:53.518332 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.518288 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ae3ec14-d235-43c1-acdf-65b5e4368321-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4ae3ec14-d235-43c1-acdf-65b5e4368321" (UID: "4ae3ec14-d235-43c1-acdf-65b5e4368321"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:14:53.573247 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.573185 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ae3ec14-d235-43c1-acdf-65b5e4368321" (UID: "4ae3ec14-d235-43c1-acdf-65b5e4368321"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:14:53.598265 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.598218 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ae3ec14-d235-43c1-acdf-65b5e4368321-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:14:53.598265 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.598261 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:14:53.598265 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.598276 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:14:53.598562 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:14:53.598289 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:14:53.598562 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.598303 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ae3ec14-d235-43c1-acdf-65b5e4368321-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:14:53.598562 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.598316 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sn9dz\" (UniqueName: \"kubernetes.io/projected/4ae3ec14-d235-43c1-acdf-65b5e4368321-kube-api-access-sn9dz\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:14:53.660823 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.660778 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-857cc6b66-kq5l9_4ae3ec14-d235-43c1-acdf-65b5e4368321/main/0.log" Apr 16 20:14:53.661216 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.661182 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerID="e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6" exitCode=137 Apr 16 20:14:53.661328 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.661270 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" event={"ID":"4ae3ec14-d235-43c1-acdf-65b5e4368321","Type":"ContainerDied","Data":"e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6"} Apr 16 20:14:53.661328 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.661319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" 
event={"ID":"4ae3ec14-d235-43c1-acdf-65b5e4368321","Type":"ContainerDied","Data":"cbe38fa380473d9d086fcd52ff544a9e0e7e903d759675c3aa6033b87cefda32"} Apr 16 20:14:53.661435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.661291 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9" Apr 16 20:14:53.661435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.661344 2572 scope.go:117] "RemoveContainer" containerID="e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6" Apr 16 20:14:53.690247 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.690173 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9"] Apr 16 20:14:53.691988 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.691959 2572 scope.go:117] "RemoveContainer" containerID="4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85" Apr 16 20:14:53.692845 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.692813 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-857cc6b66-kq5l9"] Apr 16 20:14:53.755870 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.755846 2572 scope.go:117] "RemoveContainer" containerID="e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6" Apr 16 20:14:53.756282 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:14:53.756256 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6\": container with ID starting with e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6 not found: ID does not exist" containerID="e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6" Apr 16 20:14:53.756380 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.756296 2572 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6"} err="failed to get container status \"e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6\": rpc error: code = NotFound desc = could not find container \"e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6\": container with ID starting with e29228268e2ed63f8ff4664503756c655f054b1e6f17a59c964b0bcd72b9dec6 not found: ID does not exist" Apr 16 20:14:53.756380 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.756324 2572 scope.go:117] "RemoveContainer" containerID="4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85" Apr 16 20:14:53.756601 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:14:53.756581 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85\": container with ID starting with 4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85 not found: ID does not exist" containerID="4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85" Apr 16 20:14:53.756718 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:53.756610 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85"} err="failed to get container status \"4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85\": rpc error: code = NotFound desc = could not find container \"4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85\": container with ID starting with 4b336e1ed070fc1b126e7428d87c2a392548e2e3505e67f0b8ec9de508a1aa85 not found: ID does not exist" Apr 16 20:14:54.131372 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:54.131327 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" 
podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:14:55.367580 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:14:55.367538 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" path="/var/lib/kubelet/pods/4ae3ec14-d235-43c1-acdf-65b5e4368321/volumes" Apr 16 20:15:00.799306 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:00.799258 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:15:00.832037 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:00.831991 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:15:04.131135 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:04.131068 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:15:10.799735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:10.799660 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:15:10.831296 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:10.831257 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:15:14.131061 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:14.131005 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:15:20.799490 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:20.799415 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:15:20.831619 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:20.831579 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:15:24.130962 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:24.130906 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" 
podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:15:30.799112 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:30.799048 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:15:30.831523 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:30.831482 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:15:34.131452 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:34.131408 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:15:40.798981 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:40.798934 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:15:40.831189 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:40.831153 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:15:44.130761 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:44.130715 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:15:50.799201 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:50.799154 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:15:50.831014 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:50.830977 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:15:54.130911 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:15:54.130865 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:16:00.798856 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:00.798812 2572 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:16:00.831799 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:00.831762 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:16:04.130859 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:04.130818 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:16:10.799313 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:10.799263 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:16:10.831624 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:10.831577 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 
20:16:14.131567 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:14.131487 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:16:20.799376 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:20.799334 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:16:20.831576 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:20.831536 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:16:24.131601 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:24.131522 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 16 20:16:30.799402 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:30.799357 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 
10.132.0.34:8001: connect: connection refused" Apr 16 20:16:30.831957 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:30.831909 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:16:34.140819 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:34.140785 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:16:34.148867 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:34.148841 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:16:35.091417 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:35.091385 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw"] Apr 16 20:16:35.113035 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:35.113005 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:35.113197 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:35.113079 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs podName:05be8fbd-8c8e-4bef-aa87-0ccb67a6875f nodeName:}" failed. No retries permitted until 2026-04-16 20:16:35.613058714 +0000 UTC m=+1362.791820727 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs") pod "stop-feature-test-kserve-55c8bffd6-2lnnw" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:35.617965 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:35.617929 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:35.618367 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:35.618051 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs podName:05be8fbd-8c8e-4bef-aa87-0ccb67a6875f nodeName:}" failed. No retries permitted until 2026-04-16 20:16:36.618031028 +0000 UTC m=+1363.796793055 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs") pod "stop-feature-test-kserve-55c8bffd6-2lnnw" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:36.027145 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:36.027032 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" containerID="cri-o://da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39" gracePeriod=30 Apr 16 20:16:36.628554 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:36.628518 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:36.628933 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:36.628602 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs podName:05be8fbd-8c8e-4bef-aa87-0ccb67a6875f nodeName:}" failed. No retries permitted until 2026-04-16 20:16:38.628583123 +0000 UTC m=+1365.807345131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs") pod "stop-feature-test-kserve-55c8bffd6-2lnnw" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:38.646394 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:38.646355 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:38.646778 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:38.646427 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs podName:05be8fbd-8c8e-4bef-aa87-0ccb67a6875f nodeName:}" failed. No retries permitted until 2026-04-16 20:16:42.646412617 +0000 UTC m=+1369.825174620 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs") pod "stop-feature-test-kserve-55c8bffd6-2lnnw" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:40.799722 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:40.799674 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:16:40.831863 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:40.831827 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:16:42.682810 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:42.682776 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:42.683260 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:42.682856 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs podName:05be8fbd-8c8e-4bef-aa87-0ccb67a6875f nodeName:}" failed. No retries permitted until 2026-04-16 20:16:50.682838485 +0000 UTC m=+1377.861600489 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs") pod "stop-feature-test-kserve-55c8bffd6-2lnnw" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:50.759544 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:50.759503 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:50.759995 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:16:50.759577 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs podName:05be8fbd-8c8e-4bef-aa87-0ccb67a6875f nodeName:}" failed. No retries permitted until 2026-04-16 20:17:06.759562966 +0000 UTC m=+1393.938324969 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs") pod "stop-feature-test-kserve-55c8bffd6-2lnnw" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f") : secret "stop-feature-test-kserve-self-signed-certs" not found Apr 16 20:16:50.799465 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:50.799411 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 20:16:50.831459 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:50.831424 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:16:58.817152 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817120 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw"] Apr 16 20:16:58.817611 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817593 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" Apr 16 20:16:58.817677 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817613 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" Apr 16 20:16:58.817677 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817623 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" containerName="storage-initializer" Apr 16 20:16:58.817677 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817629 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" containerName="storage-initializer" Apr 16 20:16:58.817677 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817643 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" containerName="main" Apr 16 20:16:58.817677 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817649 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" containerName="main" Apr 16 20:16:58.817677 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817677 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="storage-initializer" Apr 16 20:16:58.817937 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817686 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="storage-initializer" Apr 16 20:16:58.817937 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817750 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ae3ec14-d235-43c1-acdf-65b5e4368321" containerName="main" Apr 16 20:16:58.817937 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.817763 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7712cba0-bd22-4ce1-99a7-7775f0f0e342" containerName="main" Apr 16 20:16:58.823020 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.823001 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:58.833787 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.833766 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw"] Apr 16 20:16:58.936516 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.936483 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-home\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:58.936699 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.936533 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-model-cache\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:58.936699 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.936591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-kserve-provision-location\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:58.936699 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.936668 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45dc5182-f8dc-4793-b9e7-aa120a718845-tls-certs\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:58.936825 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.936699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-dshm\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:58.936825 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:58.936734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5qcj\" (UniqueName: \"kubernetes.io/projected/45dc5182-f8dc-4793-b9e7-aa120a718845-kube-api-access-k5qcj\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038027 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.037982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45dc5182-f8dc-4793-b9e7-aa120a718845-tls-certs\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: 
\"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038260 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.038057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-dshm\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038260 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.038161 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qcj\" (UniqueName: \"kubernetes.io/projected/45dc5182-f8dc-4793-b9e7-aa120a718845-kube-api-access-k5qcj\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038260 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.038240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-home\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038432 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.038296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-model-cache\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038588 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.038561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-kserve-provision-location\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038652 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.038615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-home\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038729 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.038703 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-model-cache\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.038948 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.038920 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-kserve-provision-location\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.041274 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.041241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45dc5182-f8dc-4793-b9e7-aa120a718845-tls-certs\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.041607 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.041574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-dshm\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.051878 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.051812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5qcj\" (UniqueName: \"kubernetes.io/projected/45dc5182-f8dc-4793-b9e7-aa120a718845-kube-api-access-k5qcj\") pod \"stop-feature-test-kserve-55c8bffd6-9jwhw\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.133692 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.133597 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:16:59.276293 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:16:59.276263 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw"] Apr 16 20:16:59.279802 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:16:59.279775 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45dc5182_f8dc_4793_b9e7_aa120a718845.slice/crio-2ec8a850deecd3f90866b64be0322cf48f88a54b3ed453dc9b26f7991ca4ed26 WatchSource:0}: Error finding container 2ec8a850deecd3f90866b64be0322cf48f88a54b3ed453dc9b26f7991ca4ed26: Status 404 returned error can't find the container with id 2ec8a850deecd3f90866b64be0322cf48f88a54b3ed453dc9b26f7991ca4ed26 Apr 16 20:17:00.117144 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:00.117085 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" event={"ID":"45dc5182-f8dc-4793-b9e7-aa120a718845","Type":"ContainerStarted","Data":"7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130"} Apr 16 20:17:00.117516 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:00.117151 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" event={"ID":"45dc5182-f8dc-4793-b9e7-aa120a718845","Type":"ContainerStarted","Data":"2ec8a850deecd3f90866b64be0322cf48f88a54b3ed453dc9b26f7991ca4ed26"} Apr 16 20:17:00.799000 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:00.798951 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8001/health\": dial tcp 10.132.0.34:8001: connect: connection refused" Apr 16 
20:17:00.831215 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:00.831165 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:17:05.138329 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:05.138290 2572 generic.go:358] "Generic (PLEG): container finished" podID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerID="7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130" exitCode=0 Apr 16 20:17:05.138716 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:05.138357 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" event={"ID":"45dc5182-f8dc-4793-b9e7-aa120a718845","Type":"ContainerDied","Data":"7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130"} Apr 16 20:17:06.143841 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.143803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" event={"ID":"45dc5182-f8dc-4793-b9e7-aa120a718845","Type":"ContainerStarted","Data":"12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658"} Apr 16 20:17:06.166789 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.166712 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podStartSLOduration=8.166693277 podStartE2EDuration="8.166693277s" podCreationTimestamp="2026-04-16 20:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:17:06.164851318 +0000 UTC m=+1393.343613342" watchObservedRunningTime="2026-04-16 20:17:06.166693277 +0000 UTC m=+1393.345455306" 
Apr 16 20:17:06.304138 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.304087 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-55c8bffd6-2lnnw_05be8fbd-8c8e-4bef-aa87-0ccb67a6875f/main/0.log" Apr 16 20:17:06.304473 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.304457 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:17:06.411512 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.411479 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kserve-provision-location\") pod \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " Apr 16 20:17:06.411684 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.411521 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-dshm\") pod \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " Apr 16 20:17:06.411684 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.411576 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs\") pod \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " Apr 16 20:17:06.411684 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.411634 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-model-cache\") pod \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " Apr 16 20:17:06.411851 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:17:06.411686 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdfl5\" (UniqueName: \"kubernetes.io/projected/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kube-api-access-sdfl5\") pod \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " Apr 16 20:17:06.411851 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.411714 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-home\") pod \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\" (UID: \"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f\") " Apr 16 20:17:06.412065 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.412028 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-model-cache" (OuterVolumeSpecName: "model-cache") pod "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:06.412355 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.412332 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-home" (OuterVolumeSpecName: "home") pod "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:06.414280 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.414250 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-dshm" (OuterVolumeSpecName: "dshm") pod "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:06.414406 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.414286 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kube-api-access-sdfl5" (OuterVolumeSpecName: "kube-api-access-sdfl5") pod "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f"). InnerVolumeSpecName "kube-api-access-sdfl5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:17:06.414484 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.414467 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:17:06.468902 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.468857 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" (UID: "05be8fbd-8c8e-4bef-aa87-0ccb67a6875f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:06.513317 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.513278 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:17:06.513317 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.513314 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdfl5\" (UniqueName: \"kubernetes.io/projected/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kube-api-access-sdfl5\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:17:06.513508 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.513330 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:17:06.513508 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.513344 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:17:06.513508 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.513356 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:17:06.513508 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:06.513367 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:17:07.148917 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.148886 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-55c8bffd6-2lnnw_05be8fbd-8c8e-4bef-aa87-0ccb67a6875f/main/0.log" Apr 16 20:17:07.149333 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.149293 2572 generic.go:358] "Generic (PLEG): container finished" podID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerID="da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39" exitCode=137 Apr 16 20:17:07.149387 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.149333 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" event={"ID":"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f","Type":"ContainerDied","Data":"da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39"} Apr 16 20:17:07.149387 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.149360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" event={"ID":"05be8fbd-8c8e-4bef-aa87-0ccb67a6875f","Type":"ContainerDied","Data":"e982cff5ce02998fbe78bd1e93b30c34822b1ea78b430850d410a27e7ca827e5"} Apr 16 20:17:07.149387 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.149375 2572 scope.go:117] "RemoveContainer" containerID="da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39" Apr 16 20:17:07.149524 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.149419 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw" Apr 16 20:17:07.171452 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.171203 2572 scope.go:117] "RemoveContainer" containerID="0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88" Apr 16 20:17:07.175334 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.175310 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw"] Apr 16 20:17:07.179468 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.179441 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-2lnnw"] Apr 16 20:17:07.183669 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.183643 2572 scope.go:117] "RemoveContainer" containerID="da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39" Apr 16 20:17:07.183969 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:17:07.183942 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39\": container with ID starting with da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39 not found: ID does not exist" containerID="da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39" Apr 16 20:17:07.184050 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.183978 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39"} err="failed to get container status \"da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39\": rpc error: code = NotFound desc = could not find container \"da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39\": container with ID starting with da91519ead7d06207b26d530a88a8ccfb9e90eb31b6bbf0043ce830dfc392a39 not found: ID does not exist" Apr 16 
20:17:07.184050 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.184003 2572 scope.go:117] "RemoveContainer" containerID="0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88" Apr 16 20:17:07.184295 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:17:07.184275 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88\": container with ID starting with 0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88 not found: ID does not exist" containerID="0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88" Apr 16 20:17:07.184374 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.184305 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88"} err="failed to get container status \"0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88\": rpc error: code = NotFound desc = could not find container \"0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88\": container with ID starting with 0c7a480cd26efb06096747426a98b228761130437f6f3b14453149ecc1e40d88 not found: ID does not exist" Apr 16 20:17:07.366960 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:07.366926 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" path="/var/lib/kubelet/pods/05be8fbd-8c8e-4bef-aa87-0ccb67a6875f/volumes" Apr 16 20:17:09.134064 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:09.134020 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:17:09.134611 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:09.134086 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" 
Apr 16 20:17:09.136222 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:09.136179 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:17:10.808592 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:10.808561 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" Apr 16 20:17:10.828597 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:10.828569 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" Apr 16 20:17:10.832125 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:10.831803 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 16 20:17:19.134489 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:19.134447 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:17:20.841736 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:20.841698 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" Apr 16 20:17:20.849967 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:17:20.849939 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" Apr 16 20:17:29.134901 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:29.134359 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:17:32.311776 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:32.311742 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"] Apr 16 20:17:32.312453 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:32.312201 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" containerID="cri-o://48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8" gracePeriod=30 Apr 16 20:17:32.317991 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:32.317962 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"] Apr 16 20:17:32.318290 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:32.318267 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" containerID="cri-o://80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0" gracePeriod=30 Apr 16 20:17:35.553711 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.553669 2572 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k"] Apr 16 20:17:35.554088 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.553990 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="storage-initializer" Apr 16 20:17:35.554088 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.554002 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="storage-initializer" Apr 16 20:17:35.554088 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.554010 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" Apr 16 20:17:35.554088 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.554015 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" Apr 16 20:17:35.554088 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.554064 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="05be8fbd-8c8e-4bef-aa87-0ccb67a6875f" containerName="main" Apr 16 20:17:35.558714 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.558697 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.561409 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.561384 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-zhchc\"" Apr 16 20:17:35.561530 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.561384 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 20:17:35.568827 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.568807 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k"] Apr 16 20:17:35.580069 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.580045 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c"] Apr 16 20:17:35.583632 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.583597 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.596524 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.596499 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c"] Apr 16 20:17:35.678039 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-dshm\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.678193 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678053 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.678193 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678084 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.678193 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-home\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.678332 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.678332 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678224 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.678332 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpdnw\" (UniqueName: \"kubernetes.io/projected/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kube-api-access-kpdnw\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.678332 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mbnd\" (UniqueName: 
\"kubernetes.io/projected/55379d66-7d64-4888-ad47-e977a8686a02-kube-api-access-8mbnd\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.678332 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678307 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55379d66-7d64-4888-ad47-e977a8686a02-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.678557 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.678557 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.678557 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.678444 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.779842 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.779809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.780013 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.779851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-home\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.780013 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.779892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.780013 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.779928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: 
\"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.780013 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.779956 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpdnw\" (UniqueName: \"kubernetes.io/projected/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kube-api-access-kpdnw\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.780013 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.779992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mbnd\" (UniqueName: \"kubernetes.io/projected/55379d66-7d64-4888-ad47-e977a8686a02-kube-api-access-8mbnd\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.780315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55379d66-7d64-4888-ad47-e977a8686a02-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.780315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.780315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.780315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.780315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780188 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-dshm\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.780315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.780620 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-home\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.780620 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780335 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.780620 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780544 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.780620 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.780815 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.780630 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.781058 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.781035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.782652 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.782627 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-dshm\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.782756 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.782697 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.782820 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.782804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55379d66-7d64-4888-ad47-e977a8686a02-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: 
\"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.783167 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.783148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.789314 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.789289 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mbnd\" (UniqueName: \"kubernetes.io/projected/55379d66-7d64-4888-ad47-e977a8686a02-kube-api-access-8mbnd\") pod \"custom-route-timeout-pd-test-kserve-6699675ffd-sm45k\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.790749 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.790721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpdnw\" (UniqueName: \"kubernetes.io/projected/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kube-api-access-kpdnw\") pod \"custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:35.870833 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.870757 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:35.898824 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:35.898792 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:36.022428 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:36.021695 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k"] Apr 16 20:17:36.025305 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:17:36.025084 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55379d66_7d64_4888_ad47_e977a8686a02.slice/crio-5f886570908f0e7ba1cd4d3aced28ea7b6f310d40676e7f19d43f08f8c81ad75 WatchSource:0}: Error finding container 5f886570908f0e7ba1cd4d3aced28ea7b6f310d40676e7f19d43f08f8c81ad75: Status 404 returned error can't find the container with id 5f886570908f0e7ba1cd4d3aced28ea7b6f310d40676e7f19d43f08f8c81ad75 Apr 16 20:17:36.027399 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:36.027375 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:17:36.047776 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:36.047753 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c"] Apr 16 20:17:36.050987 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:17:36.050965 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d396a1_380e_4e12_aef0_d8cb1c29f4bb.slice/crio-a8f1ea1d23f49dea6dc0006b98a6b599e5f821eaa1afe98daf7acd344a73560e WatchSource:0}: Error finding container a8f1ea1d23f49dea6dc0006b98a6b599e5f821eaa1afe98daf7acd344a73560e: Status 404 returned error can't find the container with id a8f1ea1d23f49dea6dc0006b98a6b599e5f821eaa1afe98daf7acd344a73560e Apr 16 20:17:36.255820 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:36.255685 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" event={"ID":"55379d66-7d64-4888-ad47-e977a8686a02","Type":"ContainerStarted","Data":"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273"} Apr 16 20:17:36.255820 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:36.255735 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" event={"ID":"55379d66-7d64-4888-ad47-e977a8686a02","Type":"ContainerStarted","Data":"5f886570908f0e7ba1cd4d3aced28ea7b6f310d40676e7f19d43f08f8c81ad75"} Apr 16 20:17:36.255820 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:36.255757 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:36.257266 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:36.257233 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" event={"ID":"53d396a1-380e-4e12-aef0-d8cb1c29f4bb","Type":"ContainerStarted","Data":"57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86"} Apr 16 20:17:36.257445 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:36.257271 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" event={"ID":"53d396a1-380e-4e12-aef0-d8cb1c29f4bb","Type":"ContainerStarted","Data":"a8f1ea1d23f49dea6dc0006b98a6b599e5f821eaa1afe98daf7acd344a73560e"} Apr 16 20:17:37.263932 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:37.263891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" event={"ID":"55379d66-7d64-4888-ad47-e977a8686a02","Type":"ContainerStarted","Data":"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d"} Apr 16 20:17:39.134399 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:17:39.134349 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:17:41.283011 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:41.282976 2572 generic.go:358] "Generic (PLEG): container finished" podID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerID="57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86" exitCode=0 Apr 16 20:17:41.283488 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:41.283058 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" event={"ID":"53d396a1-380e-4e12-aef0-d8cb1c29f4bb","Type":"ContainerDied","Data":"57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86"} Apr 16 20:17:41.284978 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:41.284950 2572 generic.go:358] "Generic (PLEG): container finished" podID="55379d66-7d64-4888-ad47-e977a8686a02" containerID="8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d" exitCode=0 Apr 16 20:17:41.285077 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:41.285019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" event={"ID":"55379d66-7d64-4888-ad47-e977a8686a02","Type":"ContainerDied","Data":"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d"} Apr 16 20:17:42.290236 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:42.290143 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" 
event={"ID":"53d396a1-380e-4e12-aef0-d8cb1c29f4bb","Type":"ContainerStarted","Data":"25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef"} Apr 16 20:17:42.292501 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:42.292472 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" event={"ID":"55379d66-7d64-4888-ad47-e977a8686a02","Type":"ContainerStarted","Data":"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71"} Apr 16 20:17:42.313532 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:42.313475 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podStartSLOduration=7.313454189 podStartE2EDuration="7.313454189s" podCreationTimestamp="2026-04-16 20:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:17:42.31120345 +0000 UTC m=+1429.489965479" watchObservedRunningTime="2026-04-16 20:17:42.313454189 +0000 UTC m=+1429.492216212" Apr 16 20:17:42.335855 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:42.335797 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podStartSLOduration=7.335777351 podStartE2EDuration="7.335777351s" podCreationTimestamp="2026-04-16 20:17:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:17:42.334573994 +0000 UTC m=+1429.513336023" watchObservedRunningTime="2026-04-16 20:17:42.335777351 +0000 UTC m=+1429.514539376" Apr 16 20:17:45.870985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:45.870941 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:45.870985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:45.870990 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:45.872264 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:45.872233 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:17:45.899156 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:45.899118 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:45.899326 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:45.899166 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:17:45.900633 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:45.900593 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:17:49.134315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:49.134272 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: 
connect: connection refused" Apr 16 20:17:55.872190 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:55.872135 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:17:55.891557 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:55.891527 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:17:55.899913 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:55.899871 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:17:59.134648 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:17:59.134600 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:18:02.318777 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.318739 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="llm-d-routing-sidecar" containerID="cri-o://b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e" gracePeriod=2 Apr 16 20:18:02.848540 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.848470 2572 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" Apr 16 20:18:02.852410 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.852388 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh_d527aadd-baba-4971-b2e9-5df1ede5f2d0/main/0.log" Apr 16 20:18:02.853112 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.853077 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" Apr 16 20:18:02.941910 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.941880 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-model-cache\") pod \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " Apr 16 20:18:02.941910 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.941921 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-dshm\") pod \"93e8c4ec-79da-4421-be85-19b6810b3069\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " Apr 16 20:18:02.942179 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.941967 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kserve-provision-location\") pod \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " Apr 16 20:18:02.942179 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942038 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-dshm\") pod \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " Apr 16 20:18:02.942179 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942060 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-home\") pod \"93e8c4ec-79da-4421-be85-19b6810b3069\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " Apr 16 20:18:02.942179 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942118 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-home\") pod \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " Apr 16 20:18:02.942179 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942166 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-model-cache\") pod \"93e8c4ec-79da-4421-be85-19b6810b3069\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " Apr 16 20:18:02.942440 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942193 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njq55\" (UniqueName: \"kubernetes.io/projected/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kube-api-access-njq55\") pod \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " Apr 16 20:18:02.942440 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942228 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d527aadd-baba-4971-b2e9-5df1ede5f2d0-tls-certs\") pod \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\" (UID: \"d527aadd-baba-4971-b2e9-5df1ede5f2d0\") " Apr 16 
20:18:02.942440 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942283 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnd27\" (UniqueName: \"kubernetes.io/projected/93e8c4ec-79da-4421-be85-19b6810b3069-kube-api-access-mnd27\") pod \"93e8c4ec-79da-4421-be85-19b6810b3069\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " Apr 16 20:18:02.942440 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942312 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-kserve-provision-location\") pod \"93e8c4ec-79da-4421-be85-19b6810b3069\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " Apr 16 20:18:02.942440 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942339 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93e8c4ec-79da-4421-be85-19b6810b3069-tls-certs\") pod \"93e8c4ec-79da-4421-be85-19b6810b3069\" (UID: \"93e8c4ec-79da-4421-be85-19b6810b3069\") " Apr 16 20:18:02.942671 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942187 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-model-cache" (OuterVolumeSpecName: "model-cache") pod "d527aadd-baba-4971-b2e9-5df1ede5f2d0" (UID: "d527aadd-baba-4971-b2e9-5df1ede5f2d0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:02.942671 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942444 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-home" (OuterVolumeSpecName: "home") pod "93e8c4ec-79da-4421-be85-19b6810b3069" (UID: "93e8c4ec-79da-4421-be85-19b6810b3069"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:02.942671 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942467 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-home" (OuterVolumeSpecName: "home") pod "d527aadd-baba-4971-b2e9-5df1ede5f2d0" (UID: "d527aadd-baba-4971-b2e9-5df1ede5f2d0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:02.942671 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942593 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-model-cache" (OuterVolumeSpecName: "model-cache") pod "93e8c4ec-79da-4421-be85-19b6810b3069" (UID: "93e8c4ec-79da-4421-be85-19b6810b3069"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:02.942893 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942771 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:02.942893 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942788 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:02.942893 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942801 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:02.942893 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.942813 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:02.945305 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.945276 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-dshm" (OuterVolumeSpecName: "dshm") pod "d527aadd-baba-4971-b2e9-5df1ede5f2d0" (UID: "d527aadd-baba-4971-b2e9-5df1ede5f2d0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:02.945645 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.945603 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d527aadd-baba-4971-b2e9-5df1ede5f2d0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d527aadd-baba-4971-b2e9-5df1ede5f2d0" (UID: "d527aadd-baba-4971-b2e9-5df1ede5f2d0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:18:02.945645 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.945614 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kube-api-access-njq55" (OuterVolumeSpecName: "kube-api-access-njq55") pod "d527aadd-baba-4971-b2e9-5df1ede5f2d0" (UID: "d527aadd-baba-4971-b2e9-5df1ede5f2d0"). InnerVolumeSpecName "kube-api-access-njq55". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:18:02.955799 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.955747 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-dshm" (OuterVolumeSpecName: "dshm") pod "93e8c4ec-79da-4421-be85-19b6810b3069" (UID: "93e8c4ec-79da-4421-be85-19b6810b3069"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:02.955799 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.955767 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e8c4ec-79da-4421-be85-19b6810b3069-kube-api-access-mnd27" (OuterVolumeSpecName: "kube-api-access-mnd27") pod "93e8c4ec-79da-4421-be85-19b6810b3069" (UID: "93e8c4ec-79da-4421-be85-19b6810b3069"). InnerVolumeSpecName "kube-api-access-mnd27". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:18:02.957062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:02.957036 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93e8c4ec-79da-4421-be85-19b6810b3069-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "93e8c4ec-79da-4421-be85-19b6810b3069" (UID: "93e8c4ec-79da-4421-be85-19b6810b3069"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:18:03.004986 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.004941 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d527aadd-baba-4971-b2e9-5df1ede5f2d0" (UID: "d527aadd-baba-4971-b2e9-5df1ede5f2d0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:03.014893 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.014852 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "93e8c4ec-79da-4421-be85-19b6810b3069" (UID: "93e8c4ec-79da-4421-be85-19b6810b3069"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:18:03.043717 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.043678 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:03.043717 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.043714 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:03.043905 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.043730 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d527aadd-baba-4971-b2e9-5df1ede5f2d0-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:03.043905 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.043742 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-njq55\" (UniqueName: \"kubernetes.io/projected/d527aadd-baba-4971-b2e9-5df1ede5f2d0-kube-api-access-njq55\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:03.043905 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.043754 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d527aadd-baba-4971-b2e9-5df1ede5f2d0-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:03.043905 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.043767 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mnd27\" (UniqueName: \"kubernetes.io/projected/93e8c4ec-79da-4421-be85-19b6810b3069-kube-api-access-mnd27\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:03.043905 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.043779 2572 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/93e8c4ec-79da-4421-be85-19b6810b3069-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:03.043905 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.043794 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/93e8c4ec-79da-4421-be85-19b6810b3069-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:18:03.399046 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.399014 2572 generic.go:358] "Generic (PLEG): container finished" podID="93e8c4ec-79da-4421-be85-19b6810b3069" containerID="48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8" exitCode=137 Apr 16 20:18:03.399501 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.399130 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" event={"ID":"93e8c4ec-79da-4421-be85-19b6810b3069","Type":"ContainerDied","Data":"48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8"} Apr 16 20:18:03.399501 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.399147 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" Apr 16 20:18:03.399501 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.399164 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr" event={"ID":"93e8c4ec-79da-4421-be85-19b6810b3069","Type":"ContainerDied","Data":"3aec01e45da837e52b0e0089e32b9346c1a36262f3f574d1d5b4791bc30edcad"} Apr 16 20:18:03.399501 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.399189 2572 scope.go:117] "RemoveContainer" containerID="48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8" Apr 16 20:18:03.401354 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.400941 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh_d527aadd-baba-4971-b2e9-5df1ede5f2d0/main/0.log" Apr 16 20:18:03.402339 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.402313 2572 generic.go:358] "Generic (PLEG): container finished" podID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerID="80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0" exitCode=137 Apr 16 20:18:03.402339 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.402338 2572 generic.go:358] "Generic (PLEG): container finished" podID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerID="b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e" exitCode=0 Apr 16 20:18:03.402511 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.402365 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" event={"ID":"d527aadd-baba-4971-b2e9-5df1ede5f2d0","Type":"ContainerDied","Data":"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0"} Apr 16 20:18:03.402511 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.402389 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" event={"ID":"d527aadd-baba-4971-b2e9-5df1ede5f2d0","Type":"ContainerDied","Data":"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e"} Apr 16 20:18:03.402511 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.402403 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" event={"ID":"d527aadd-baba-4971-b2e9-5df1ede5f2d0","Type":"ContainerDied","Data":"aee5638edb506d488fe3675ea8333ab51b5504c967def73164be68ea6eb9e4d4"} Apr 16 20:18:03.402810 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.402608 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh" Apr 16 20:18:03.424423 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.424394 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"] Apr 16 20:18:03.430444 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.430413 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6967dsr"] Apr 16 20:18:03.430929 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.430906 2572 scope.go:117] "RemoveContainer" containerID="fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf" Apr 16 20:18:03.442765 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.442738 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"] Apr 16 20:18:03.446876 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.446853 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-f5b5466ff-frvkh"] Apr 16 
20:18:03.485766 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.485741 2572 scope.go:117] "RemoveContainer" containerID="48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8" Apr 16 20:18:03.486164 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:18:03.486141 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8\": container with ID starting with 48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8 not found: ID does not exist" containerID="48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8" Apr 16 20:18:03.486285 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.486175 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8"} err="failed to get container status \"48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8\": rpc error: code = NotFound desc = could not find container \"48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8\": container with ID starting with 48d20975e294cbccfb434acaabbeb864c96fea15d80b070c5434d549e099f5d8 not found: ID does not exist" Apr 16 20:18:03.486285 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.486202 2572 scope.go:117] "RemoveContainer" containerID="fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf" Apr 16 20:18:03.486547 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:18:03.486523 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf\": container with ID starting with fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf not found: ID does not exist" containerID="fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf" Apr 16 20:18:03.486625 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.486555 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf"} err="failed to get container status \"fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf\": rpc error: code = NotFound desc = could not find container \"fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf\": container with ID starting with fc6588f0c92e2913244039890c1a36f827b6ee1a8f76b65183698a0d259176bf not found: ID does not exist" Apr 16 20:18:03.486625 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.486576 2572 scope.go:117] "RemoveContainer" containerID="80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0" Apr 16 20:18:03.518488 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.517606 2572 scope.go:117] "RemoveContainer" containerID="21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f" Apr 16 20:18:03.601276 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.601249 2572 scope.go:117] "RemoveContainer" containerID="b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e" Apr 16 20:18:03.609959 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.609932 2572 scope.go:117] "RemoveContainer" containerID="80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0" Apr 16 20:18:03.610328 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:18:03.610296 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0\": container with ID starting with 80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0 not found: ID does not exist" containerID="80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0" Apr 16 20:18:03.610447 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.610338 2572 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0"} err="failed to get container status \"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0\": rpc error: code = NotFound desc = could not find container \"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0\": container with ID starting with 80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0 not found: ID does not exist" Apr 16 20:18:03.610447 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.610367 2572 scope.go:117] "RemoveContainer" containerID="21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f" Apr 16 20:18:03.610714 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:18:03.610678 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f\": container with ID starting with 21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f not found: ID does not exist" containerID="21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f" Apr 16 20:18:03.610791 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.610724 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f"} err="failed to get container status \"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f\": rpc error: code = NotFound desc = could not find container \"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f\": container with ID starting with 21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f not found: ID does not exist" Apr 16 20:18:03.610791 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.610748 2572 scope.go:117] "RemoveContainer" containerID="b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e" Apr 16 20:18:03.611015 ip-10-0-128-48 
kubenswrapper[2572]: E0416 20:18:03.610995 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e\": container with ID starting with b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e not found: ID does not exist" containerID="b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e" Apr 16 20:18:03.611081 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.611019 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e"} err="failed to get container status \"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e\": rpc error: code = NotFound desc = could not find container \"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e\": container with ID starting with b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e not found: ID does not exist" Apr 16 20:18:03.611081 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.611034 2572 scope.go:117] "RemoveContainer" containerID="80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0" Apr 16 20:18:03.611360 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.611326 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0"} err="failed to get container status \"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0\": rpc error: code = NotFound desc = could not find container \"80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0\": container with ID starting with 80999210b81f313b611bfe44191ee8260144aafd07df018b54224475ac9de3e0 not found: ID does not exist" Apr 16 20:18:03.611360 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.611351 2572 scope.go:117] "RemoveContainer" 
containerID="21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f" Apr 16 20:18:03.611617 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.611588 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f"} err="failed to get container status \"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f\": rpc error: code = NotFound desc = could not find container \"21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f\": container with ID starting with 21cd938444174e89f25e61963f831e15340c0391b3598e97d6a9a45fa9b4031f not found: ID does not exist" Apr 16 20:18:03.611685 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.611619 2572 scope.go:117] "RemoveContainer" containerID="b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e" Apr 16 20:18:03.611889 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:03.611868 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e"} err="failed to get container status \"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e\": rpc error: code = NotFound desc = could not find container \"b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e\": container with ID starting with b7e081c84f432c85178363ebde3cdf303d0234509525fc701394854075bfa44e not found: ID does not exist" Apr 16 20:18:05.367211 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:05.367172 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" path="/var/lib/kubelet/pods/93e8c4ec-79da-4421-be85-19b6810b3069/volumes" Apr 16 20:18:05.367706 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:05.367596 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" 
path="/var/lib/kubelet/pods/d527aadd-baba-4971-b2e9-5df1ede5f2d0/volumes" Apr 16 20:18:05.871738 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:05.871679 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:18:05.899603 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:05.899558 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:18:09.134846 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:09.134793 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:18:15.872038 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:15.871990 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:18:15.899819 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:15.899770 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:18:19.134399 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:19.134356 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:18:25.871709 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:25.871655 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:18:25.899663 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:25.899610 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:18:29.134154 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:29.134088 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:18:35.872053 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:35.872002 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" 
podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:18:35.899949 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:35.899905 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:18:39.134405 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:39.134365 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 20:18:45.871338 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:45.871292 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:18:45.900181 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:45.900140 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:18:49.143675 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:49.143645 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:18:49.151987 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:49.151964 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:18:50.035026 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:50.034971 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw"] Apr 16 20:18:50.577563 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:50.577518 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" containerID="cri-o://12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658" gracePeriod=30 Apr 16 20:18:53.320935 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:53.320904 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:18:53.325490 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:53.325465 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:18:53.328490 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:53.328458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 20:18:53.332681 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:53.332663 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 20:18:55.872211 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:55.872160 2572 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:18:55.899275 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:18:55.899233 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:19:05.872109 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:05.872048 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:19:05.899375 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:05.899342 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:19:15.871345 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:15.871232 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:19:15.900042 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:15.900003 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:19:21.307693 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.307671 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-55c8bffd6-9jwhw_45dc5182-f8dc-4793-b9e7-aa120a718845/main/0.log" Apr 16 20:19:21.308062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.308049 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:19:21.352336 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.352310 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-dshm\") pod \"45dc5182-f8dc-4793-b9e7-aa120a718845\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " Apr 16 20:19:21.352454 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.352352 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5qcj\" (UniqueName: \"kubernetes.io/projected/45dc5182-f8dc-4793-b9e7-aa120a718845-kube-api-access-k5qcj\") pod \"45dc5182-f8dc-4793-b9e7-aa120a718845\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " Apr 16 20:19:21.352454 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.352415 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-model-cache\") pod \"45dc5182-f8dc-4793-b9e7-aa120a718845\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " Apr 16 
20:19:21.352573 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.352476 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45dc5182-f8dc-4793-b9e7-aa120a718845-tls-certs\") pod \"45dc5182-f8dc-4793-b9e7-aa120a718845\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " Apr 16 20:19:21.352573 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.352516 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-home\") pod \"45dc5182-f8dc-4793-b9e7-aa120a718845\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " Apr 16 20:19:21.352729 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.352685 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-model-cache" (OuterVolumeSpecName: "model-cache") pod "45dc5182-f8dc-4793-b9e7-aa120a718845" (UID: "45dc5182-f8dc-4793-b9e7-aa120a718845"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:19:21.353025 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.352992 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-home" (OuterVolumeSpecName: "home") pod "45dc5182-f8dc-4793-b9e7-aa120a718845" (UID: "45dc5182-f8dc-4793-b9e7-aa120a718845"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:19:21.354400 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.354380 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-dshm" (OuterVolumeSpecName: "dshm") pod "45dc5182-f8dc-4793-b9e7-aa120a718845" (UID: "45dc5182-f8dc-4793-b9e7-aa120a718845"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:19:21.354831 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.354805 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45dc5182-f8dc-4793-b9e7-aa120a718845-kube-api-access-k5qcj" (OuterVolumeSpecName: "kube-api-access-k5qcj") pod "45dc5182-f8dc-4793-b9e7-aa120a718845" (UID: "45dc5182-f8dc-4793-b9e7-aa120a718845"). InnerVolumeSpecName "kube-api-access-k5qcj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:19:21.354831 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.354819 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45dc5182-f8dc-4793-b9e7-aa120a718845-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "45dc5182-f8dc-4793-b9e7-aa120a718845" (UID: "45dc5182-f8dc-4793-b9e7-aa120a718845"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:19:21.453560 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.453468 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-kserve-provision-location\") pod \"45dc5182-f8dc-4793-b9e7-aa120a718845\" (UID: \"45dc5182-f8dc-4793-b9e7-aa120a718845\") " Apr 16 20:19:21.453720 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.453676 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45dc5182-f8dc-4793-b9e7-aa120a718845-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:19:21.453720 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.453695 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:19:21.453720 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:19:21.453706 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:19:21.453720 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.453720 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5qcj\" (UniqueName: \"kubernetes.io/projected/45dc5182-f8dc-4793-b9e7-aa120a718845-kube-api-access-k5qcj\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:19:21.453948 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.453733 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:19:21.509232 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.509195 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "45dc5182-f8dc-4793-b9e7-aa120a718845" (UID: "45dc5182-f8dc-4793-b9e7-aa120a718845"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:19:21.554863 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.554835 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45dc5182-f8dc-4793-b9e7-aa120a718845-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:19:21.684375 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.684353 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-55c8bffd6-9jwhw_45dc5182-f8dc-4793-b9e7-aa120a718845/main/0.log" Apr 16 20:19:21.684702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.684675 2572 generic.go:358] "Generic (PLEG): container finished" podID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerID="12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658" exitCode=137 Apr 16 20:19:21.684769 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.684747 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" Apr 16 20:19:21.684769 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.684754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" event={"ID":"45dc5182-f8dc-4793-b9e7-aa120a718845","Type":"ContainerDied","Data":"12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658"} Apr 16 20:19:21.684836 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.684794 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw" event={"ID":"45dc5182-f8dc-4793-b9e7-aa120a718845","Type":"ContainerDied","Data":"2ec8a850deecd3f90866b64be0322cf48f88a54b3ed453dc9b26f7991ca4ed26"} Apr 16 20:19:21.684836 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.684814 2572 scope.go:117] "RemoveContainer" containerID="12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658" Apr 16 20:19:21.707034 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.706979 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw"] Apr 16 20:19:21.710735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.710713 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-55c8bffd6-9jwhw"] Apr 16 20:19:21.713747 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.713726 2572 scope.go:117] "RemoveContainer" containerID="7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130" Apr 16 20:19:21.723749 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.723732 2572 scope.go:117] "RemoveContainer" containerID="12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658" Apr 16 20:19:21.724013 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:19:21.723994 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658\": container with ID starting with 12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658 not found: ID does not exist" containerID="12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658" Apr 16 20:19:21.724074 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.724027 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658"} err="failed to get container status \"12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658\": rpc error: code = NotFound desc = could not find container \"12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658\": container with ID starting with 12edcd79d04f6334c88db91fd0a36e79df70705c5e9f6dd211d2c6c82852e658 not found: ID does not exist" Apr 16 20:19:21.724074 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.724044 2572 scope.go:117] "RemoveContainer" containerID="7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130" Apr 16 20:19:21.724348 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:19:21.724319 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130\": container with ID starting with 7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130 not found: ID does not exist" containerID="7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130" Apr 16 20:19:21.724397 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:21.724356 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130"} err="failed to get container status \"7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130\": rpc error: code = NotFound desc = could not find container 
\"7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130\": container with ID starting with 7fb51a82719af14a531c234a55855885ed862cf51d325924d3c02ae54bbf6130 not found: ID does not exist" Apr 16 20:19:23.366342 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:23.366307 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" path="/var/lib/kubelet/pods/45dc5182-f8dc-4793-b9e7-aa120a718845/volumes" Apr 16 20:19:25.871625 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:25.871574 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:19:25.899662 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:25.899622 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:19:35.871982 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:35.871934 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:19:35.899246 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:35.899213 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:19:45.872167 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:45.872117 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:19:45.899518 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:45.899482 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:19:55.872002 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:55.871953 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 20:19:55.899364 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:19:55.899324 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 20:20:05.881002 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:05.880973 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:20:05.894204 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:05.894182 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:20:05.909748 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:05.909727 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:20:05.917594 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:05.917572 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:20:17.898287 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898257 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:20:17.898653 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898612 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" Apr 16 20:20:17.898653 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898627 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" Apr 16 20:20:17.898653 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898639 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="storage-initializer" Apr 16 20:20:17.898653 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898645 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="storage-initializer" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898665 2572 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="storage-initializer" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898674 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="storage-initializer" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898692 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898700 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898716 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="storage-initializer" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898725 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="storage-initializer" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898736 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898745 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898759 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="llm-d-routing-sidecar" Apr 16 20:20:17.898832 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898774 2572 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="llm-d-routing-sidecar" Apr 16 20:20:17.899176 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898860 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="llm-d-routing-sidecar" Apr 16 20:20:17.899176 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898876 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="45dc5182-f8dc-4793-b9e7-aa120a718845" containerName="main" Apr 16 20:20:17.899176 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898886 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d527aadd-baba-4971-b2e9-5df1ede5f2d0" containerName="main" Apr 16 20:20:17.899176 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.898893 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="93e8c4ec-79da-4421-be85-19b6810b3069" containerName="main" Apr 16 20:20:17.901499 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.901482 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:17.905547 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.905525 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-bghkq\"" Apr 16 20:20:17.905677 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.905531 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 20:20:17.911965 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.911942 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:20:17.920730 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.920700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:17.920960 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.920933 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:17.921079 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.920990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:17.921079 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.921020 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86cf0680-5743-47ef-8e5c-d40b64bb9070-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:17.921079 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.921071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:17.921279 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.921114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b785w\" (UniqueName: \"kubernetes.io/projected/86cf0680-5743-47ef-8e5c-d40b64bb9070-kube-api-access-b785w\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:17.943334 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.943308 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:20:17.945465 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.945451 
2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:17.959652 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:17.959629 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:20:18.022535 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.022702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.022702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86cf0680-5743-47ef-8e5c-d40b64bb9070-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.022702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e9affb05-4d32-4615-85a8-d9ba201725ed-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.022702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.022702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b785w\" (UniqueName: \"kubernetes.io/projected/86cf0680-5743-47ef-8e5c-d40b64bb9070-kube-api-access-b785w\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.022702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022656 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhvh6\" (UniqueName: \"kubernetes.io/projected/e9affb05-4d32-4615-85a8-d9ba201725ed-kube-api-access-mhvh6\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.022702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022684 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.023062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022812 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.023062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.023062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.023062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022947 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-home\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.023062 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.022949 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.023279 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.023075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.023279 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.023196 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.024939 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.024916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.025052 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.025037 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86cf0680-5743-47ef-8e5c-d40b64bb9070-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.030759 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.030735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b785w\" (UniqueName: \"kubernetes.io/projected/86cf0680-5743-47ef-8e5c-d40b64bb9070-kube-api-access-b785w\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.123726 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.123689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvh6\" (UniqueName: \"kubernetes.io/projected/e9affb05-4d32-4615-85a8-d9ba201725ed-kube-api-access-mhvh6\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.123900 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.123737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.123900 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.123764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.123900 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.123788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.123900 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.123835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.124143 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.123995 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9affb05-4d32-4615-85a8-d9ba201725ed-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.124143 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.124116 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.124250 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.124142 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.124250 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.124160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.126149 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.126126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.126311 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.126294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9affb05-4d32-4615-85a8-d9ba201725ed-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.131630 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.131610 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhvh6\" (UniqueName: \"kubernetes.io/projected/e9affb05-4d32-4615-85a8-d9ba201725ed-kube-api-access-mhvh6\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.211777 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.211703 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:18.254861 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.254830 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:20:18.343109 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.343067 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:20:18.345901 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:20:18.345859 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86cf0680_5743_47ef_8e5c_d40b64bb9070.slice/crio-4841ce586c55b2abd246435a887a4926a65bc784d0e24ab4a3b817a9f792d3ea WatchSource:0}: Error finding container 4841ce586c55b2abd246435a887a4926a65bc784d0e24ab4a3b817a9f792d3ea: Status 404 returned error can't find the container with id 4841ce586c55b2abd246435a887a4926a65bc784d0e24ab4a3b817a9f792d3ea Apr 16 20:20:18.392700 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.392676 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:20:18.394613 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:20:18.394586 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9affb05_4d32_4615_85a8_d9ba201725ed.slice/crio-85264eb8f8b0eb2d753556ac20e9215d3550c6a200e26b86d90b390098380157 WatchSource:0}: Error finding container 85264eb8f8b0eb2d753556ac20e9215d3550c6a200e26b86d90b390098380157: Status 404 returned error can't find the container with id 85264eb8f8b0eb2d753556ac20e9215d3550c6a200e26b86d90b390098380157 Apr 16 20:20:18.887698 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.887620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"86cf0680-5743-47ef-8e5c-d40b64bb9070","Type":"ContainerStarted","Data":"3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f"} Apr 16 20:20:18.887698 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.887668 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"86cf0680-5743-47ef-8e5c-d40b64bb9070","Type":"ContainerStarted","Data":"4841ce586c55b2abd246435a887a4926a65bc784d0e24ab4a3b817a9f792d3ea"} Apr 16 20:20:18.889152 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.889123 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e9affb05-4d32-4615-85a8-d9ba201725ed","Type":"ContainerStarted","Data":"bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f"} Apr 16 20:20:18.889152 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:18.889156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"e9affb05-4d32-4615-85a8-d9ba201725ed","Type":"ContainerStarted","Data":"85264eb8f8b0eb2d753556ac20e9215d3550c6a200e26b86d90b390098380157"} Apr 16 20:20:22.908935 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:22.908899 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9affb05-4d32-4615-85a8-d9ba201725ed" containerID="bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f" exitCode=0 Apr 16 20:20:22.909312 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:22.908973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e9affb05-4d32-4615-85a8-d9ba201725ed","Type":"ContainerDied","Data":"bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f"} Apr 16 20:20:22.910452 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:22.910431 2572 generic.go:358] "Generic (PLEG): container finished" podID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerID="3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f" exitCode=0 Apr 16 20:20:22.910552 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:22.910467 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"86cf0680-5743-47ef-8e5c-d40b64bb9070","Type":"ContainerDied","Data":"3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f"} Apr 16 20:20:23.917610 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:23.917567 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"86cf0680-5743-47ef-8e5c-d40b64bb9070","Type":"ContainerStarted","Data":"7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042"} Apr 16 20:20:23.919592 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:23.919566 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e9affb05-4d32-4615-85a8-d9ba201725ed","Type":"ContainerStarted","Data":"78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c"} Apr 16 20:20:23.941443 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:23.941384 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=6.941366437 podStartE2EDuration="6.941366437s" podCreationTimestamp="2026-04-16 20:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:23.939072239 +0000 UTC m=+1591.117834265" watchObservedRunningTime="2026-04-16 20:20:23.941366437 +0000 UTC m=+1591.120128464" Apr 16 20:20:23.960553 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:23.960495 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.9604779390000004 podStartE2EDuration="6.960477939s" podCreationTimestamp="2026-04-16 20:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:23.958876875 +0000 UTC m=+1591.137638915" watchObservedRunningTime="2026-04-16 20:20:23.960477939 +0000 UTC m=+1591.139239965" Apr 16 20:20:28.212961 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:28.212918 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:28.217078 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:28.214664 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:20:33.375815 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:33.375770 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c"] Apr 16 20:20:33.376953 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:33.376811 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" containerID="cri-o://25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef" gracePeriod=30 Apr 16 20:20:33.381244 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:33.381218 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k"] Apr 16 20:20:33.381652 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:33.381593 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" containerID="cri-o://10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71" gracePeriod=30 Apr 16 20:20:38.212649 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:38.212600 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:20:48.212114 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.211994 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:20:48.212558 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.212284 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:20:48.579698 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.579663 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h"] Apr 16 20:20:48.585576 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.585551 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.588285 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.588261 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 20:20:48.588412 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.588263 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-jdfcl\"" Apr 16 20:20:48.593156 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.593059 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h"] Apr 16 20:20:48.604977 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.604908 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl"] Apr 16 20:20:48.609871 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.609746 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.621667 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.620567 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl"] Apr 16 20:20:48.628140 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628105 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.628286 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-model-cache\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.628286 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-dshm\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.628286 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.628464 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628301 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-home\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.628464 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-home\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.628464 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.628464 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628402 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snh5b\" (UniqueName: 
\"kubernetes.io/projected/d92bff9e-c823-4740-9e8c-bd283d0e313b-kube-api-access-snh5b\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.628464 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628427 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-tls-certs\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.628731 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628503 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d92bff9e-c823-4740-9e8c-bd283d0e313b-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.628731 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twscj\" (UniqueName: \"kubernetes.io/projected/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kube-api-access-twscj\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.628731 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.628602 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-model-cache\") 
pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.729859 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.729823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-home\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.729859 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.729865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-home\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.730143 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.729909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.730143 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.729945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snh5b\" (UniqueName: \"kubernetes.io/projected/d92bff9e-c823-4740-9e8c-bd283d0e313b-kube-api-access-snh5b\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.730143 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.729973 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-tls-certs\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.730143 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d92bff9e-c823-4740-9e8c-bd283d0e313b-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.730143 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twscj\" (UniqueName: \"kubernetes.io/projected/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kube-api-access-twscj\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.730426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.730426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730226 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.730426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730263 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-model-cache\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.730426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-dshm\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.730426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730351 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-home\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.730426 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-home\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" 
(UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.730731 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730661 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.730731 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730675 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-model-cache\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.730731 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.730884 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730680 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 
20:20:48.731021 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.730997 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.733691 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.733221 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-dshm\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.733691 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.733296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.733691 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.733597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d92bff9e-c823-4740-9e8c-bd283d0e313b-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.733691 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.733638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-tls-certs\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.738465 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.738443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snh5b\" (UniqueName: \"kubernetes.io/projected/d92bff9e-c823-4740-9e8c-bd283d0e313b-kube-api-access-snh5b\") pod \"router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:48.738583 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.738552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twscj\" (UniqueName: \"kubernetes.io/projected/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kube-api-access-twscj\") pod \"router-with-refs-pd-test-kserve-8485dcccd-66n2h\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.909496 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.909405 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:48.927260 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:48.927224 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:49.076234 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:49.076185 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h"] Apr 16 20:20:49.078684 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:20:49.078653 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod150cd8db_a7e6_4238_9ad0_760a02ca4ba4.slice/crio-50cae8db8888646371b2f14e6dddb7ca72350717592c7375b7ead3d9f791a022 WatchSource:0}: Error finding container 50cae8db8888646371b2f14e6dddb7ca72350717592c7375b7ead3d9f791a022: Status 404 returned error can't find the container with id 50cae8db8888646371b2f14e6dddb7ca72350717592c7375b7ead3d9f791a022 Apr 16 20:20:49.108880 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:49.108850 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl"] Apr 16 20:20:49.114009 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:20:49.113975 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd92bff9e_c823_4740_9e8c_bd283d0e313b.slice/crio-f5a3866b5cb10ca74533c16afae8d7481951151c9ead99161964a143796d9862 WatchSource:0}: Error finding container f5a3866b5cb10ca74533c16afae8d7481951151c9ead99161964a143796d9862: Status 404 returned error can't find the container with id f5a3866b5cb10ca74533c16afae8d7481951151c9ead99161964a143796d9862 Apr 16 20:20:50.034999 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:50.034910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" 
event={"ID":"150cd8db-a7e6-4238-9ad0-760a02ca4ba4","Type":"ContainerStarted","Data":"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150"} Apr 16 20:20:50.034999 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:50.034961 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" event={"ID":"150cd8db-a7e6-4238-9ad0-760a02ca4ba4","Type":"ContainerStarted","Data":"50cae8db8888646371b2f14e6dddb7ca72350717592c7375b7ead3d9f791a022"} Apr 16 20:20:50.035574 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:50.035154 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:50.036962 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:50.036906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" event={"ID":"d92bff9e-c823-4740-9e8c-bd283d0e313b","Type":"ContainerStarted","Data":"aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa"} Apr 16 20:20:50.036962 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:50.036947 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" event={"ID":"d92bff9e-c823-4740-9e8c-bd283d0e313b","Type":"ContainerStarted","Data":"f5a3866b5cb10ca74533c16afae8d7481951151c9ead99161964a143796d9862"} Apr 16 20:20:51.042776 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:51.042645 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" event={"ID":"150cd8db-a7e6-4238-9ad0-760a02ca4ba4","Type":"ContainerStarted","Data":"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435"} Apr 16 20:20:54.058231 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:54.058194 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerID="aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa" exitCode=0 Apr 16 20:20:54.058697 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:54.058283 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" event={"ID":"d92bff9e-c823-4740-9e8c-bd283d0e313b","Type":"ContainerDied","Data":"aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa"} Apr 16 20:20:55.064199 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:55.064162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" event={"ID":"d92bff9e-c823-4740-9e8c-bd283d0e313b","Type":"ContainerStarted","Data":"998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d"} Apr 16 20:20:55.066198 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:55.066169 2572 generic.go:358] "Generic (PLEG): container finished" podID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerID="6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435" exitCode=0 Apr 16 20:20:55.066306 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:55.066208 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" event={"ID":"150cd8db-a7e6-4238-9ad0-760a02ca4ba4","Type":"ContainerDied","Data":"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435"} Apr 16 20:20:55.086296 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:55.086242 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podStartSLOduration=7.086221812 podStartE2EDuration="7.086221812s" podCreationTimestamp="2026-04-16 20:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
20:20:55.083575975 +0000 UTC m=+1622.262338034" watchObservedRunningTime="2026-04-16 20:20:55.086221812 +0000 UTC m=+1622.264983840" Apr 16 20:20:56.073171 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:56.073127 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" event={"ID":"150cd8db-a7e6-4238-9ad0-760a02ca4ba4","Type":"ContainerStarted","Data":"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513"} Apr 16 20:20:56.112225 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:56.112142 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podStartSLOduration=8.112120524 podStartE2EDuration="8.112120524s" podCreationTimestamp="2026-04-16 20:20:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:56.109867981 +0000 UTC m=+1623.288630003" watchObservedRunningTime="2026-04-16 20:20:56.112120524 +0000 UTC m=+1623.290882747" Apr 16 20:20:58.212524 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:58.212474 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:20:58.910198 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:58.910143 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:20:58.910413 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:58.910237 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 
20:20:58.911672 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:58.911639 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:20:58.927994 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:58.927967 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:58.928211 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:58.928008 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:20:58.929403 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:20:58.929373 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:21:03.382270 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.382229 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="llm-d-routing-sidecar" containerID="cri-o://a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273" gracePeriod=2 Apr 16 20:21:03.912278 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.912243 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:21:03.928980 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.928908 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6699675ffd-sm45k_55379d66-7d64-4888-ad47-e977a8686a02/main/0.log" Apr 16 20:21:03.929932 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.929907 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:21:03.991174 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991136 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/55379d66-7d64-4888-ad47-e977a8686a02-tls-certs\") pod \"55379d66-7d64-4888-ad47-e977a8686a02\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " Apr 16 20:21:03.991370 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991196 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mbnd\" (UniqueName: \"kubernetes.io/projected/55379d66-7d64-4888-ad47-e977a8686a02-kube-api-access-8mbnd\") pod \"55379d66-7d64-4888-ad47-e977a8686a02\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " Apr 16 20:21:03.991370 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991256 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-dshm\") pod \"55379d66-7d64-4888-ad47-e977a8686a02\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " Apr 16 20:21:03.991370 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991297 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kserve-provision-location\") pod \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " Apr 16 20:21:03.991370 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991335 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-home\") pod \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " Apr 16 20:21:03.991616 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991375 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-kserve-provision-location\") pod \"55379d66-7d64-4888-ad47-e977a8686a02\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " Apr 16 20:21:03.991616 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991407 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-dshm\") pod \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " Apr 16 20:21:03.991616 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991458 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-tls-certs\") pod \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " Apr 16 20:21:03.991616 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991481 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpdnw\" (UniqueName: \"kubernetes.io/projected/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kube-api-access-kpdnw\") pod \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\" (UID: 
\"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " Apr 16 20:21:03.991616 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991517 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-model-cache\") pod \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\" (UID: \"53d396a1-380e-4e12-aef0-d8cb1c29f4bb\") " Apr 16 20:21:03.991616 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991564 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-home\") pod \"55379d66-7d64-4888-ad47-e977a8686a02\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " Apr 16 20:21:03.991616 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.991587 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-model-cache\") pod \"55379d66-7d64-4888-ad47-e977a8686a02\" (UID: \"55379d66-7d64-4888-ad47-e977a8686a02\") " Apr 16 20:21:03.992132 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.992083 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-model-cache" (OuterVolumeSpecName: "model-cache") pod "55379d66-7d64-4888-ad47-e977a8686a02" (UID: "55379d66-7d64-4888-ad47-e977a8686a02"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:03.993979 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.993359 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-home" (OuterVolumeSpecName: "home") pod "53d396a1-380e-4e12-aef0-d8cb1c29f4bb" (UID: "53d396a1-380e-4e12-aef0-d8cb1c29f4bb"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:03.993979 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.993622 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-model-cache" (OuterVolumeSpecName: "model-cache") pod "53d396a1-380e-4e12-aef0-d8cb1c29f4bb" (UID: "53d396a1-380e-4e12-aef0-d8cb1c29f4bb"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:03.994861 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.994826 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55379d66-7d64-4888-ad47-e977a8686a02-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "55379d66-7d64-4888-ad47-e977a8686a02" (UID: "55379d66-7d64-4888-ad47-e977a8686a02"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:21:03.997585 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.995893 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "53d396a1-380e-4e12-aef0-d8cb1c29f4bb" (UID: "53d396a1-380e-4e12-aef0-d8cb1c29f4bb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:21:03.998183 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.997853 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-home" (OuterVolumeSpecName: "home") pod "55379d66-7d64-4888-ad47-e977a8686a02" (UID: "55379d66-7d64-4888-ad47-e977a8686a02"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:03.999766 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:03.999736 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55379d66-7d64-4888-ad47-e977a8686a02-kube-api-access-8mbnd" (OuterVolumeSpecName: "kube-api-access-8mbnd") pod "55379d66-7d64-4888-ad47-e977a8686a02" (UID: "55379d66-7d64-4888-ad47-e977a8686a02"). InnerVolumeSpecName "kube-api-access-8mbnd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:21:04.000875 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.000829 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-dshm" (OuterVolumeSpecName: "dshm") pod "55379d66-7d64-4888-ad47-e977a8686a02" (UID: "55379d66-7d64-4888-ad47-e977a8686a02"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:04.009884 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.009834 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kube-api-access-kpdnw" (OuterVolumeSpecName: "kube-api-access-kpdnw") pod "53d396a1-380e-4e12-aef0-d8cb1c29f4bb" (UID: "53d396a1-380e-4e12-aef0-d8cb1c29f4bb"). InnerVolumeSpecName "kube-api-access-kpdnw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:21:04.011642 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.011609 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-dshm" (OuterVolumeSpecName: "dshm") pod "53d396a1-380e-4e12-aef0-d8cb1c29f4bb" (UID: "53d396a1-380e-4e12-aef0-d8cb1c29f4bb"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:04.059682 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.059625 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "53d396a1-380e-4e12-aef0-d8cb1c29f4bb" (UID: "53d396a1-380e-4e12-aef0-d8cb1c29f4bb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:04.065573 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.065527 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "55379d66-7d64-4888-ad47-e977a8686a02" (UID: "55379d66-7d64-4888-ad47-e977a8686a02"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:21:04.093222 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093182 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093222 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093218 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093235 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 
16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093249 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093268 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093282 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpdnw\" (UniqueName: \"kubernetes.io/projected/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-kube-api-access-kpdnw\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093296 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d396a1-380e-4e12-aef0-d8cb1c29f4bb-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093310 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093323 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093337 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/55379d66-7d64-4888-ad47-e977a8686a02-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093351 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mbnd\" (UniqueName: \"kubernetes.io/projected/55379d66-7d64-4888-ad47-e977a8686a02-kube-api-access-8mbnd\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.093451 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.093365 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/55379d66-7d64-4888-ad47-e977a8686a02-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:21:04.108299 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.108251 2572 generic.go:358] "Generic (PLEG): container finished" podID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerID="25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef" exitCode=137 Apr 16 20:21:04.108458 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.108295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" event={"ID":"53d396a1-380e-4e12-aef0-d8cb1c29f4bb","Type":"ContainerDied","Data":"25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef"} Apr 16 20:21:04.108458 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.108343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" event={"ID":"53d396a1-380e-4e12-aef0-d8cb1c29f4bb","Type":"ContainerDied","Data":"a8f1ea1d23f49dea6dc0006b98a6b599e5f821eaa1afe98daf7acd344a73560e"} Apr 16 20:21:04.108458 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.108365 2572 scope.go:117] "RemoveContainer" containerID="25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef" Apr 16 20:21:04.108458 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:21:04.108365 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c" Apr 16 20:21:04.109857 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.109834 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6699675ffd-sm45k_55379d66-7d64-4888-ad47-e977a8686a02/main/0.log" Apr 16 20:21:04.114429 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.114135 2572 generic.go:358] "Generic (PLEG): container finished" podID="55379d66-7d64-4888-ad47-e977a8686a02" containerID="10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71" exitCode=137 Apr 16 20:21:04.114429 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.114157 2572 generic.go:358] "Generic (PLEG): container finished" podID="55379d66-7d64-4888-ad47-e977a8686a02" containerID="a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273" exitCode=0 Apr 16 20:21:04.114429 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.114192 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" event={"ID":"55379d66-7d64-4888-ad47-e977a8686a02","Type":"ContainerDied","Data":"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71"} Apr 16 20:21:04.114429 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.114219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" event={"ID":"55379d66-7d64-4888-ad47-e977a8686a02","Type":"ContainerDied","Data":"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273"} Apr 16 20:21:04.114429 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.114227 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" Apr 16 20:21:04.114429 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.114237 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k" event={"ID":"55379d66-7d64-4888-ad47-e977a8686a02","Type":"ContainerDied","Data":"5f886570908f0e7ba1cd4d3aced28ea7b6f310d40676e7f19d43f08f8c81ad75"} Apr 16 20:21:04.146605 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.146575 2572 scope.go:117] "RemoveContainer" containerID="57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86" Apr 16 20:21:04.153382 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.153277 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c"] Apr 16 20:21:04.158925 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.157727 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-56987b47c5-7rx8c"] Apr 16 20:21:04.162629 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.162608 2572 scope.go:117] "RemoveContainer" containerID="25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef" Apr 16 20:21:04.163000 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:21:04.162977 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef\": container with ID starting with 25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef not found: ID does not exist" containerID="25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef" Apr 16 20:21:04.163168 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.163143 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef"} err="failed to get container status \"25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef\": rpc error: code = NotFound desc = could not find container \"25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef\": container with ID starting with 25b9a79fbc8247b14792c17293b446687bf89a30111a3350cb81c5ae0786b0ef not found: ID does not exist" Apr 16 20:21:04.163287 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.163274 2572 scope.go:117] "RemoveContainer" containerID="57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86" Apr 16 20:21:04.163701 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:21:04.163675 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86\": container with ID starting with 57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86 not found: ID does not exist" containerID="57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86" Apr 16 20:21:04.163795 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.163710 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86"} err="failed to get container status \"57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86\": rpc error: code = NotFound desc = could not find container \"57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86\": container with ID starting with 57a1a33896d9298a404363ebe5163ed2e6f407958aa0f22c91cefc38cff5fc86 not found: ID does not exist" Apr 16 20:21:04.163795 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.163760 2572 scope.go:117] "RemoveContainer" containerID="10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71" Apr 16 20:21:04.176786 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:21:04.176766 2572 scope.go:117] "RemoveContainer" containerID="8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d" Apr 16 20:21:04.181535 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.181368 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k"] Apr 16 20:21:04.183555 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.183539 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6699675ffd-sm45k"] Apr 16 20:21:04.199117 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.199078 2572 scope.go:117] "RemoveContainer" containerID="a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273" Apr 16 20:21:04.210003 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.209984 2572 scope.go:117] "RemoveContainer" containerID="10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71" Apr 16 20:21:04.210338 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:21:04.210312 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71\": container with ID starting with 10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71 not found: ID does not exist" containerID="10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71" Apr 16 20:21:04.210433 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.210348 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71"} err="failed to get container status \"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71\": rpc error: code = NotFound desc = could not find container \"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71\": container with ID starting with 
10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71 not found: ID does not exist" Apr 16 20:21:04.210433 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.210372 2572 scope.go:117] "RemoveContainer" containerID="8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d" Apr 16 20:21:04.210732 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:21:04.210696 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d\": container with ID starting with 8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d not found: ID does not exist" containerID="8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d" Apr 16 20:21:04.210818 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.210726 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d"} err="failed to get container status \"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d\": rpc error: code = NotFound desc = could not find container \"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d\": container with ID starting with 8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d not found: ID does not exist" Apr 16 20:21:04.210818 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.210748 2572 scope.go:117] "RemoveContainer" containerID="a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273" Apr 16 20:21:04.211046 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:21:04.211012 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273\": container with ID starting with a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273 not found: ID does not exist" 
containerID="a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273" Apr 16 20:21:04.211193 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.211042 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273"} err="failed to get container status \"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273\": rpc error: code = NotFound desc = could not find container \"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273\": container with ID starting with a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273 not found: ID does not exist" Apr 16 20:21:04.211193 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.211064 2572 scope.go:117] "RemoveContainer" containerID="10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71" Apr 16 20:21:04.211462 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.211435 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71"} err="failed to get container status \"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71\": rpc error: code = NotFound desc = could not find container \"10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71\": container with ID starting with 10d15a105dcb26dfa39df60e9471444942257af52f1de6464767606836a92d71 not found: ID does not exist" Apr 16 20:21:04.211553 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.211465 2572 scope.go:117] "RemoveContainer" containerID="8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d" Apr 16 20:21:04.211732 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.211705 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d"} err="failed to get container status 
\"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d\": rpc error: code = NotFound desc = could not find container \"8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d\": container with ID starting with 8f000e126a7d0b4ae10eaaf0bd5ee95d9d104eb1f7a03d506d4e3b30b6ae3f7d not found: ID does not exist" Apr 16 20:21:04.211732 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.211734 2572 scope.go:117] "RemoveContainer" containerID="a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273" Apr 16 20:21:04.212135 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:04.212059 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273"} err="failed to get container status \"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273\": rpc error: code = NotFound desc = could not find container \"a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273\": container with ID starting with a05bd43e0fd99b4d1c37eb876a649416190319e3d8594ef87ebb41c28e9b2273 not found: ID does not exist" Apr 16 20:21:05.368172 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:05.368135 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" path="/var/lib/kubelet/pods/53d396a1-380e-4e12-aef0-d8cb1c29f4bb/volumes" Apr 16 20:21:05.368829 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:05.368802 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55379d66-7d64-4888-ad47-e977a8686a02" path="/var/lib/kubelet/pods/55379d66-7d64-4888-ad47-e977a8686a02/volumes" Apr 16 20:21:08.213239 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:08.213194 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:21:08.910859 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:08.910806 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:21:08.929186 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:08.929039 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:21:08.929702 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:08.929675 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:21:18.212446 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:18.212395 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:21:18.910806 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:18.910736 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:21:18.927807 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:21:18.927755 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:21:28.212220 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:28.212176 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:21:28.910982 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:28.910787 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:21:28.928336 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:28.928297 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:21:38.212815 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:38.212755 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection 
refused" Apr 16 20:21:38.909904 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:38.909860 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:21:38.928177 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:38.928134 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:21:48.212577 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:48.212534 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:21:48.910515 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:48.910450 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:21:48.927843 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:48.927798 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 
10.132.0.42:8000: connect: connection refused" Apr 16 20:21:58.212517 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:58.212471 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:21:58.909929 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:58.909880 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:21:58.927695 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:21:58.927658 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:22:08.212345 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:08.212292 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:22:08.910406 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:08.910369 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:22:08.927911 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:08.927866 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:22:18.212596 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:18.212499 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:22:18.910202 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:18.910160 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:22:18.928773 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:18.928740 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:22:28.213039 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:28.212998 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" 
containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:22:28.910373 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:28.910327 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:22:28.927631 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:28.927598 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:22:38.212577 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:38.212532 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:22:38.910583 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:38.910537 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:22:38.928381 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:38.928345 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" 
podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:22:48.213085 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:48.213042 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:22:48.910489 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:48.910435 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:22:48.928118 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:48.928069 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:22:58.212175 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:58.212134 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:22:58.910301 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:58.910243 2572 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:22:58.927903 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:22:58.927873 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:23:08.212606 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:08.212560 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:23:08.910765 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:08.910723 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:23:08.928505 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:08.928470 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:23:18.212802 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:18.212764 2572 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:23:18.910764 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:18.910722 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:23:18.927826 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:18.927787 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:23:28.212567 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:28.212519 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 20:23:28.910008 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:28.909964 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:23:28.928041 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:23:28.928003 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:23:38.227515 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:38.227480 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:23:38.238940 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:38.238918 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:23:38.910575 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:38.910530 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:23:38.928069 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:38.928024 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:23:48.910782 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:48.910733 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:23:48.928825 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:48.928788 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:23:53.350498 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:53.350464 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:23:53.355177 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:53.355151 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:23:53.358580 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:53.358558 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 20:23:53.362879 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:53.362846 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log" Apr 16 20:23:53.861907 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:53.861870 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:23:53.862274 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:53.862237 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" containerID="cri-o://7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042" gracePeriod=30 Apr 16 20:23:55.216274 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.216252 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:23:55.355118 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355015 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86cf0680-5743-47ef-8e5c-d40b64bb9070-tls-certs\") pod \"86cf0680-5743-47ef-8e5c-d40b64bb9070\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " Apr 16 20:23:55.355267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355124 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-kserve-provision-location\") pod \"86cf0680-5743-47ef-8e5c-d40b64bb9070\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " Apr 16 20:23:55.355267 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355243 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b785w\" (UniqueName: \"kubernetes.io/projected/86cf0680-5743-47ef-8e5c-d40b64bb9070-kube-api-access-b785w\") pod \"86cf0680-5743-47ef-8e5c-d40b64bb9070\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " Apr 16 20:23:55.355361 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355328 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-dshm\") pod \"86cf0680-5743-47ef-8e5c-d40b64bb9070\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " Apr 16 20:23:55.355412 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355378 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-model-cache\") pod \"86cf0680-5743-47ef-8e5c-d40b64bb9070\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " Apr 16 20:23:55.355468 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355414 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-home\") pod \"86cf0680-5743-47ef-8e5c-d40b64bb9070\" (UID: \"86cf0680-5743-47ef-8e5c-d40b64bb9070\") " Apr 16 20:23:55.355644 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355599 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-model-cache" (OuterVolumeSpecName: "model-cache") pod "86cf0680-5743-47ef-8e5c-d40b64bb9070" (UID: "86cf0680-5743-47ef-8e5c-d40b64bb9070"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:55.355798 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355765 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.355908 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.355881 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-home" (OuterVolumeSpecName: "home") pod "86cf0680-5743-47ef-8e5c-d40b64bb9070" (UID: "86cf0680-5743-47ef-8e5c-d40b64bb9070"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:55.357533 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.357507 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86cf0680-5743-47ef-8e5c-d40b64bb9070-kube-api-access-b785w" (OuterVolumeSpecName: "kube-api-access-b785w") pod "86cf0680-5743-47ef-8e5c-d40b64bb9070" (UID: "86cf0680-5743-47ef-8e5c-d40b64bb9070"). InnerVolumeSpecName "kube-api-access-b785w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:23:55.357641 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.357532 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-dshm" (OuterVolumeSpecName: "dshm") pod "86cf0680-5743-47ef-8e5c-d40b64bb9070" (UID: "86cf0680-5743-47ef-8e5c-d40b64bb9070"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:55.357641 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.357627 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86cf0680-5743-47ef-8e5c-d40b64bb9070-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "86cf0680-5743-47ef-8e5c-d40b64bb9070" (UID: "86cf0680-5743-47ef-8e5c-d40b64bb9070"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:23:55.393194 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.393149 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "86cf0680-5743-47ef-8e5c-d40b64bb9070" (UID: "86cf0680-5743-47ef-8e5c-d40b64bb9070"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:55.456817 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.456772 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.456817 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.456807 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/86cf0680-5743-47ef-8e5c-d40b64bb9070-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.456817 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.456826 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.457179 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.456837 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b785w\" (UniqueName: \"kubernetes.io/projected/86cf0680-5743-47ef-8e5c-d40b64bb9070-kube-api-access-b785w\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.457179 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.456847 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/86cf0680-5743-47ef-8e5c-d40b64bb9070-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:55.730119 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.730011 2572 generic.go:358] "Generic (PLEG): container finished" podID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerID="7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042" exitCode=0 Apr 16 20:23:55.730119 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.730108 2572 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 20:23:55.730368 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.730117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"86cf0680-5743-47ef-8e5c-d40b64bb9070","Type":"ContainerDied","Data":"7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042"} Apr 16 20:23:55.730368 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.730156 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"86cf0680-5743-47ef-8e5c-d40b64bb9070","Type":"ContainerDied","Data":"4841ce586c55b2abd246435a887a4926a65bc784d0e24ab4a3b817a9f792d3ea"} Apr 16 20:23:55.730368 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.730179 2572 scope.go:117] "RemoveContainer" containerID="7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042" Apr 16 20:23:55.751723 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.751693 2572 scope.go:117] "RemoveContainer" containerID="3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f" Apr 16 20:23:55.753294 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.753271 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:23:55.759253 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.759226 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 20:23:55.778554 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.778523 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:23:55.778823 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.778795 2572 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="e9affb05-4d32-4615-85a8-d9ba201725ed" containerName="main" containerID="cri-o://78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c" gracePeriod=30 Apr 16 20:23:55.806667 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.806644 2572 scope.go:117] "RemoveContainer" containerID="7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042" Apr 16 20:23:55.806967 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:23:55.806944 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042\": container with ID starting with 7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042 not found: ID does not exist" containerID="7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042" Apr 16 20:23:55.807019 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.806970 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042"} err="failed to get container status \"7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042\": rpc error: code = NotFound desc = could not find container \"7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042\": container with ID starting with 7dd3605d493e88820f811a9166f38f5eb7f198233332eecd1fc1d859c06f6042 not found: ID does not exist" Apr 16 20:23:55.807019 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.806987 2572 scope.go:117] "RemoveContainer" containerID="3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f" Apr 16 20:23:55.807301 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:23:55.807281 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f\": container with ID starting with 3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f not found: ID does not exist" containerID="3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f" Apr 16 20:23:55.807358 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:55.807311 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f"} err="failed to get container status \"3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f\": rpc error: code = NotFound desc = could not find container \"3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f\": container with ID starting with 3fab1be540643798642ce60b125961f20e9dafce30a7b8360c87d75f7d86310f not found: ID does not exist" Apr 16 20:23:57.015788 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.015763 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:23:57.172968 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.172880 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9affb05-4d32-4615-85a8-d9ba201725ed-tls-certs\") pod \"e9affb05-4d32-4615-85a8-d9ba201725ed\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " Apr 16 20:23:57.172968 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.172932 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-model-cache\") pod \"e9affb05-4d32-4615-85a8-d9ba201725ed\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " Apr 16 20:23:57.173222 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.173136 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-dshm\") pod \"e9affb05-4d32-4615-85a8-d9ba201725ed\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " Apr 16 20:23:57.173222 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.173181 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-model-cache" (OuterVolumeSpecName: "model-cache") pod "e9affb05-4d32-4615-85a8-d9ba201725ed" (UID: "e9affb05-4d32-4615-85a8-d9ba201725ed"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:57.173222 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.173186 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-kserve-provision-location\") pod \"e9affb05-4d32-4615-85a8-d9ba201725ed\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " Apr 16 20:23:57.173388 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.173245 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-home\") pod \"e9affb05-4d32-4615-85a8-d9ba201725ed\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " Apr 16 20:23:57.173388 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.173277 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhvh6\" (UniqueName: \"kubernetes.io/projected/e9affb05-4d32-4615-85a8-d9ba201725ed-kube-api-access-mhvh6\") pod \"e9affb05-4d32-4615-85a8-d9ba201725ed\" (UID: \"e9affb05-4d32-4615-85a8-d9ba201725ed\") " Apr 16 20:23:57.173598 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.173580 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:57.173719 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.173685 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-home" (OuterVolumeSpecName: "home") pod "e9affb05-4d32-4615-85a8-d9ba201725ed" (UID: "e9affb05-4d32-4615-85a8-d9ba201725ed"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:57.175076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.175050 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9affb05-4d32-4615-85a8-d9ba201725ed-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e9affb05-4d32-4615-85a8-d9ba201725ed" (UID: "e9affb05-4d32-4615-85a8-d9ba201725ed"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:23:57.175336 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.175208 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-dshm" (OuterVolumeSpecName: "dshm") pod "e9affb05-4d32-4615-85a8-d9ba201725ed" (UID: "e9affb05-4d32-4615-85a8-d9ba201725ed"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:57.175737 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.175708 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9affb05-4d32-4615-85a8-d9ba201725ed-kube-api-access-mhvh6" (OuterVolumeSpecName: "kube-api-access-mhvh6") pod "e9affb05-4d32-4615-85a8-d9ba201725ed" (UID: "e9affb05-4d32-4615-85a8-d9ba201725ed"). InnerVolumeSpecName "kube-api-access-mhvh6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:23:57.231217 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.231176 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e9affb05-4d32-4615-85a8-d9ba201725ed" (UID: "e9affb05-4d32-4615-85a8-d9ba201725ed"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:57.274233 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.274198 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:57.274233 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.274224 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:57.274233 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.274235 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e9affb05-4d32-4615-85a8-d9ba201725ed-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:57.274499 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.274244 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mhvh6\" (UniqueName: \"kubernetes.io/projected/e9affb05-4d32-4615-85a8-d9ba201725ed-kube-api-access-mhvh6\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:57.274499 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.274253 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9affb05-4d32-4615-85a8-d9ba201725ed-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:23:57.367610 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.367571 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" path="/var/lib/kubelet/pods/86cf0680-5743-47ef-8e5c-d40b64bb9070/volumes" Apr 16 20:23:57.745325 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.745294 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="e9affb05-4d32-4615-85a8-d9ba201725ed" containerID="78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c" exitCode=0 Apr 16 20:23:57.745510 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.745366 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 20:23:57.745510 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.745371 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e9affb05-4d32-4615-85a8-d9ba201725ed","Type":"ContainerDied","Data":"78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c"} Apr 16 20:23:57.745510 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.745402 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"e9affb05-4d32-4615-85a8-d9ba201725ed","Type":"ContainerDied","Data":"85264eb8f8b0eb2d753556ac20e9215d3550c6a200e26b86d90b390098380157"} Apr 16 20:23:57.745510 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.745417 2572 scope.go:117] "RemoveContainer" containerID="78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c" Apr 16 20:23:57.765112 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.765059 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:23:57.765667 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.765650 2572 scope.go:117] "RemoveContainer" containerID="bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f" Apr 16 20:23:57.767515 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.767487 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 20:23:57.827922 ip-10-0-128-48 kubenswrapper[2572]: 
I0416 20:23:57.827903 2572 scope.go:117] "RemoveContainer" containerID="78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c" Apr 16 20:23:57.828247 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:23:57.828226 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c\": container with ID starting with 78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c not found: ID does not exist" containerID="78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c" Apr 16 20:23:57.828342 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.828253 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c"} err="failed to get container status \"78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c\": rpc error: code = NotFound desc = could not find container \"78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c\": container with ID starting with 78b9612ba23ae63a2c025f61ae79ab939c5bf8a7b8643b624676453cb432ca3c not found: ID does not exist" Apr 16 20:23:57.828342 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.828269 2572 scope.go:117] "RemoveContainer" containerID="bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f" Apr 16 20:23:57.828538 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:23:57.828520 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f\": container with ID starting with bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f not found: ID does not exist" containerID="bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f" Apr 16 20:23:57.828581 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:57.828542 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f"} err="failed to get container status \"bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f\": rpc error: code = NotFound desc = could not find container \"bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f\": container with ID starting with bcc02997898e5553a2de0046956e6c403ac07f186b6e00a98498c26df273119f not found: ID does not exist" Apr 16 20:23:58.910253 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:58.910214 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 16 20:23:58.928560 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:58.928517 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 16 20:23:59.366893 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:23:59.366850 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9affb05-4d32-4615-85a8-d9ba201725ed" path="/var/lib/kubelet/pods/e9affb05-4d32-4615-85a8-d9ba201725ed/volumes" Apr 16 20:24:06.865740 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.865703 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg"] Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866007 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866018 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866028 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9affb05-4d32-4615-85a8-d9ba201725ed" containerName="storage-initializer" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866035 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9affb05-4d32-4615-85a8-d9ba201725ed" containerName="storage-initializer" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866042 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9affb05-4d32-4615-85a8-d9ba201725ed" containerName="main" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866047 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9affb05-4d32-4615-85a8-d9ba201725ed" containerName="main" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866056 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="storage-initializer" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866061 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="storage-initializer" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866071 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866076 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866084 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="storage-initializer" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866102 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="storage-initializer" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866109 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="storage-initializer" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866115 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="storage-initializer" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866125 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866131 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866138 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="llm-d-routing-sidecar" Apr 16 20:24:06.866131 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866143 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="llm-d-routing-sidecar" Apr 16 20:24:06.866906 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866193 2572 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="e9affb05-4d32-4615-85a8-d9ba201725ed" containerName="main" Apr 16 20:24:06.866906 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866203 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="main" Apr 16 20:24:06.866906 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866210 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="53d396a1-380e-4e12-aef0-d8cb1c29f4bb" containerName="main" Apr 16 20:24:06.866906 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866219 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="86cf0680-5743-47ef-8e5c-d40b64bb9070" containerName="main" Apr 16 20:24:06.866906 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.866230 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="55379d66-7d64-4888-ad47-e977a8686a02" containerName="llm-d-routing-sidecar" Apr 16 20:24:06.872409 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.872384 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:06.875257 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.875232 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-9p8lp\"" Apr 16 20:24:06.875389 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.875374 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 20:24:06.883140 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:06.883122 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg"] Apr 16 20:24:07.062912 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.062876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ppnv\" (UniqueName: \"kubernetes.io/projected/f7c4371d-4330-457c-8865-69b36bd1a5d2-kube-api-access-8ppnv\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.063076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.062945 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.063076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.062965 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.063076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.063000 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.063076 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.063027 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c4371d-4330-457c-8865-69b36bd1a5d2-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.063257 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.063087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.164545 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.164471 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.164545 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.164506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.164727 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.164633 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.164727 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.164678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c4371d-4330-457c-8865-69b36bd1a5d2-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.164727 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.164717 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.164892 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.164757 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ppnv\" (UniqueName: \"kubernetes.io/projected/f7c4371d-4330-457c-8865-69b36bd1a5d2-kube-api-access-8ppnv\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.164892 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.164875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.165038 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.164929 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.165038 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.165001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.165164 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.165134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.167193 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.167174 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c4371d-4330-457c-8865-69b36bd1a5d2-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.172930 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.172907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ppnv\" (UniqueName: \"kubernetes.io/projected/f7c4371d-4330-457c-8865-69b36bd1a5d2-kube-api-access-8ppnv\") pod \"scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.182911 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.182891 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:07.314306 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.314276 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg"] Apr 16 20:24:07.316961 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:24:07.316935 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c4371d_4330_457c_8865_69b36bd1a5d2.slice/crio-ff1cc9d95aa4fb0a644dd37411d5db46dac9625f4102b09287791bb36a2d2841 WatchSource:0}: Error finding container ff1cc9d95aa4fb0a644dd37411d5db46dac9625f4102b09287791bb36a2d2841: Status 404 returned error can't find the container with id ff1cc9d95aa4fb0a644dd37411d5db46dac9625f4102b09287791bb36a2d2841 Apr 16 20:24:07.318809 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.318790 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:24:07.782169 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.782133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" event={"ID":"f7c4371d-4330-457c-8865-69b36bd1a5d2","Type":"ContainerStarted","Data":"92afa34b9ce72ead022f37638b1cfa1cc101165f2612d37dc67d5677aa0792cf"} Apr 16 20:24:07.782169 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:07.782174 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" event={"ID":"f7c4371d-4330-457c-8865-69b36bd1a5d2","Type":"ContainerStarted","Data":"ff1cc9d95aa4fb0a644dd37411d5db46dac9625f4102b09287791bb36a2d2841"} Apr 16 20:24:08.787334 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:08.787302 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerID="92afa34b9ce72ead022f37638b1cfa1cc101165f2612d37dc67d5677aa0792cf" exitCode=0 Apr 16 20:24:08.787759 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:08.787363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" event={"ID":"f7c4371d-4330-457c-8865-69b36bd1a5d2","Type":"ContainerDied","Data":"92afa34b9ce72ead022f37638b1cfa1cc101165f2612d37dc67d5677aa0792cf"} Apr 16 20:24:08.926734 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:08.926709 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:24:08.937517 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:08.937491 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:24:08.938821 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:08.938797 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:24:08.945245 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:08.945224 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:24:09.792507 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:09.792468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" event={"ID":"f7c4371d-4330-457c-8865-69b36bd1a5d2","Type":"ContainerStarted","Data":"a393e930e2dfc001e0fc00f9e565eeb7631afce14000e7c5ca29ba301a4c0eaa"} Apr 16 20:24:09.792932 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:09.792514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" event={"ID":"f7c4371d-4330-457c-8865-69b36bd1a5d2","Type":"ContainerStarted","Data":"b0125ada958e3a3dc2d4303d8a98c3020e0f1ccf5799130c552befed80fddd77"} Apr 16 20:24:09.792932 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:09.792770 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:09.814285 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:09.814240 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" podStartSLOduration=3.814226187 podStartE2EDuration="3.814226187s" podCreationTimestamp="2026-04-16 20:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:24:09.812332653 +0000 UTC m=+1816.991094678" watchObservedRunningTime="2026-04-16 20:24:09.814226187 +0000 UTC m=+1816.992988212" Apr 16 20:24:17.183440 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:17.183395 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:17.183440 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:17.183437 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:17.186272 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:17.186240 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:17.827324 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:17.827278 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:22.474110 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:22.474060 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h"] Apr 16 20:24:22.474932 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:22.474882 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" containerID="cri-o://bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513" gracePeriod=30 Apr 16 20:24:22.480355 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:22.480329 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl"] Apr 16 20:24:22.480610 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:22.480584 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" containerID="cri-o://998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d" gracePeriod=30 Apr 16 20:24:38.833012 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:38.832985 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:42.071059 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.071025 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl"] Apr 16 20:24:42.074928 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.074908 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.078026 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.078005 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 20:24:42.095280 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.095257 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl"] Apr 16 20:24:42.153364 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.153335 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.153487 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.153377 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.153487 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.153413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08cac1-521a-4c93-825e-0bb146316665-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: 
\"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.153487 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.153445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.153487 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.153467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7x24\" (UniqueName: \"kubernetes.io/projected/cb08cac1-521a-4c93-825e-0bb146316665-kube-api-access-p7x24\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.153487 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.153482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.254564 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.254524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08cac1-521a-4c93-825e-0bb146316665-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: 
\"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.254703 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.254582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.254703 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.254611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7x24\" (UniqueName: \"kubernetes.io/projected/cb08cac1-521a-4c93-825e-0bb146316665-kube-api-access-p7x24\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.254703 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.254632 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.254703 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.254671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.254903 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.254821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.255128 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.255080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.255229 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.255135 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.255229 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.255173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" 
Apr 16 20:24:42.256874 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.256850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.257003 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.256987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08cac1-521a-4c93-825e-0bb146316665-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.266997 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.266976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7x24\" (UniqueName: \"kubernetes.io/projected/cb08cac1-521a-4c93-825e-0bb146316665-kube-api-access-p7x24\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.385573 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.385483 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:42.518407 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.518381 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl"] Apr 16 20:24:42.521434 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:24:42.521407 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb08cac1_521a_4c93_825e_0bb146316665.slice/crio-dc6faac0da577d2c9b26e878ade3371cecfb0c092a564592413bf0f17f79b109 WatchSource:0}: Error finding container dc6faac0da577d2c9b26e878ade3371cecfb0c092a564592413bf0f17f79b109: Status 404 returned error can't find the container with id dc6faac0da577d2c9b26e878ade3371cecfb0c092a564592413bf0f17f79b109 Apr 16 20:24:42.913330 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.913296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" event={"ID":"cb08cac1-521a-4c93-825e-0bb146316665","Type":"ContainerStarted","Data":"3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700"} Apr 16 20:24:42.913330 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:42.913333 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" event={"ID":"cb08cac1-521a-4c93-825e-0bb146316665","Type":"ContainerStarted","Data":"dc6faac0da577d2c9b26e878ade3371cecfb0c092a564592413bf0f17f79b109"} Apr 16 20:24:46.930002 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:46.929969 2572 generic.go:358] "Generic (PLEG): container finished" podID="cb08cac1-521a-4c93-825e-0bb146316665" containerID="3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700" exitCode=0 Apr 16 20:24:46.930412 ip-10-0-128-48 kubenswrapper[2572]: I0416 
20:24:46.930042 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" event={"ID":"cb08cac1-521a-4c93-825e-0bb146316665","Type":"ContainerDied","Data":"3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700"} Apr 16 20:24:47.934709 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:47.934672 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" event={"ID":"cb08cac1-521a-4c93-825e-0bb146316665","Type":"ContainerStarted","Data":"e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0"} Apr 16 20:24:47.956820 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:47.956768 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podStartSLOduration=5.956750886 podStartE2EDuration="5.956750886s" podCreationTimestamp="2026-04-16 20:24:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:24:47.954893996 +0000 UTC m=+1855.133656033" watchObservedRunningTime="2026-04-16 20:24:47.956750886 +0000 UTC m=+1855.135512913" Apr 16 20:24:48.740414 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:48.740311 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg"] Apr 16 20:24:48.740705 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:48.740676 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="main" containerID="cri-o://b0125ada958e3a3dc2d4303d8a98c3020e0f1ccf5799130c552befed80fddd77" gracePeriod=30 Apr 16 20:24:48.740812 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:24:48.740746 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="tokenizer" containerID="cri-o://a393e930e2dfc001e0fc00f9e565eeb7631afce14000e7c5ca29ba301a4c0eaa" gracePeriod=30 Apr 16 20:24:48.830008 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:24:48.829978 2572 logging.go:55] [core] [Channel #41 SubChannel #42]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.43:9003", ServerName: "10.132.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.43:9003: connect: connection refused" Apr 16 20:24:48.942207 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:48.942171 2572 generic.go:358] "Generic (PLEG): container finished" podID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerID="b0125ada958e3a3dc2d4303d8a98c3020e0f1ccf5799130c552befed80fddd77" exitCode=0 Apr 16 20:24:48.942626 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:48.942250 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" event={"ID":"f7c4371d-4330-457c-8865-69b36bd1a5d2","Type":"ContainerDied","Data":"b0125ada958e3a3dc2d4303d8a98c3020e0f1ccf5799130c552befed80fddd77"} Apr 16 20:24:49.830019 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:49.829979 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.43:9003\" within 1s: context deadline exceeded" Apr 16 20:24:49.947829 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:49.947798 2572 generic.go:358] "Generic (PLEG): container finished" podID="f7c4371d-4330-457c-8865-69b36bd1a5d2" 
containerID="a393e930e2dfc001e0fc00f9e565eeb7631afce14000e7c5ca29ba301a4c0eaa" exitCode=0 Apr 16 20:24:49.948185 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:49.947867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" event={"ID":"f7c4371d-4330-457c-8865-69b36bd1a5d2","Type":"ContainerDied","Data":"a393e930e2dfc001e0fc00f9e565eeb7631afce14000e7c5ca29ba301a4c0eaa"} Apr 16 20:24:49.983596 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:49.983573 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:50.120219 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120182 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-tmp\") pod \"f7c4371d-4330-457c-8865-69b36bd1a5d2\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " Apr 16 20:24:50.120412 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120235 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c4371d-4330-457c-8865-69b36bd1a5d2-tls-certs\") pod \"f7c4371d-4330-457c-8865-69b36bd1a5d2\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " Apr 16 20:24:50.120412 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120274 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-kserve-provision-location\") pod \"f7c4371d-4330-457c-8865-69b36bd1a5d2\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " Apr 16 20:24:50.120412 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120328 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-uds\") pod \"f7c4371d-4330-457c-8865-69b36bd1a5d2\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " Apr 16 20:24:50.120412 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120364 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-cache\") pod \"f7c4371d-4330-457c-8865-69b36bd1a5d2\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " Apr 16 20:24:50.120412 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120397 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ppnv\" (UniqueName: \"kubernetes.io/projected/f7c4371d-4330-457c-8865-69b36bd1a5d2-kube-api-access-8ppnv\") pod \"f7c4371d-4330-457c-8865-69b36bd1a5d2\" (UID: \"f7c4371d-4330-457c-8865-69b36bd1a5d2\") " Apr 16 20:24:50.120674 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120618 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f7c4371d-4330-457c-8865-69b36bd1a5d2" (UID: "f7c4371d-4330-457c-8865-69b36bd1a5d2"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:50.120674 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120612 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f7c4371d-4330-457c-8865-69b36bd1a5d2" (UID: "f7c4371d-4330-457c-8865-69b36bd1a5d2"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:50.120792 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.120720 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f7c4371d-4330-457c-8865-69b36bd1a5d2" (UID: "f7c4371d-4330-457c-8865-69b36bd1a5d2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:50.121068 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.121032 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f7c4371d-4330-457c-8865-69b36bd1a5d2" (UID: "f7c4371d-4330-457c-8865-69b36bd1a5d2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:50.122385 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.122361 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c4371d-4330-457c-8865-69b36bd1a5d2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f7c4371d-4330-457c-8865-69b36bd1a5d2" (UID: "f7c4371d-4330-457c-8865-69b36bd1a5d2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:24:50.122578 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.122559 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c4371d-4330-457c-8865-69b36bd1a5d2-kube-api-access-8ppnv" (OuterVolumeSpecName: "kube-api-access-8ppnv") pod "f7c4371d-4330-457c-8865-69b36bd1a5d2" (UID: "f7c4371d-4330-457c-8865-69b36bd1a5d2"). InnerVolumeSpecName "kube-api-access-8ppnv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:24:50.221962 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.221922 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:50.221962 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.221952 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-uds\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:50.221962 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.221965 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:50.222315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.221978 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ppnv\" (UniqueName: \"kubernetes.io/projected/f7c4371d-4330-457c-8865-69b36bd1a5d2-kube-api-access-8ppnv\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:50.222315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.221988 2572 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7c4371d-4330-457c-8865-69b36bd1a5d2-tokenizer-tmp\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:50.222315 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.221997 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7c4371d-4330-457c-8865-69b36bd1a5d2-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:50.953692 ip-10-0-128-48 kubenswrapper[2572]: 
I0416 20:24:50.953655 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" event={"ID":"f7c4371d-4330-457c-8865-69b36bd1a5d2","Type":"ContainerDied","Data":"ff1cc9d95aa4fb0a644dd37411d5db46dac9625f4102b09287791bb36a2d2841"} Apr 16 20:24:50.953692 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.953698 2572 scope.go:117] "RemoveContainer" containerID="a393e930e2dfc001e0fc00f9e565eeb7631afce14000e7c5ca29ba301a4c0eaa" Apr 16 20:24:50.954263 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.953704 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg" Apr 16 20:24:50.968732 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.968711 2572 scope.go:117] "RemoveContainer" containerID="b0125ada958e3a3dc2d4303d8a98c3020e0f1ccf5799130c552befed80fddd77" Apr 16 20:24:50.976250 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.976225 2572 scope.go:117] "RemoveContainer" containerID="92afa34b9ce72ead022f37638b1cfa1cc101165f2612d37dc67d5677aa0792cf" Apr 16 20:24:50.979621 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.979597 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg"] Apr 16 20:24:50.984653 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:50.984631 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-c56884tkxg"] Apr 16 20:24:51.365980 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:51.365940 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" path="/var/lib/kubelet/pods/f7c4371d-4330-457c-8865-69b36bd1a5d2/volumes" Apr 16 20:24:52.386074 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.386032 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:52.386460 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.386086 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:24:52.387507 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.387482 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:24:52.474756 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.474696 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="llm-d-routing-sidecar" containerID="cri-o://73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150" gracePeriod=2 Apr 16 20:24:52.743842 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.743817 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:24:52.812715 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.812692 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-8485dcccd-66n2h_150cd8db-a7e6-4238-9ad0-760a02ca4ba4/main/0.log" Apr 16 20:24:52.813371 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.813347 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:24:52.845052 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.845029 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-dshm\") pod \"d92bff9e-c823-4740-9e8c-bd283d0e313b\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " Apr 16 20:24:52.845167 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.845066 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d92bff9e-c823-4740-9e8c-bd283d0e313b-tls-certs\") pod \"d92bff9e-c823-4740-9e8c-bd283d0e313b\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " Apr 16 20:24:52.845167 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.845125 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snh5b\" (UniqueName: \"kubernetes.io/projected/d92bff9e-c823-4740-9e8c-bd283d0e313b-kube-api-access-snh5b\") pod \"d92bff9e-c823-4740-9e8c-bd283d0e313b\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " Apr 16 20:24:52.845291 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.845182 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-home\") pod \"d92bff9e-c823-4740-9e8c-bd283d0e313b\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " Apr 16 20:24:52.845291 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.845224 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-kserve-provision-location\") pod \"d92bff9e-c823-4740-9e8c-bd283d0e313b\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " Apr 16 20:24:52.845291 ip-10-0-128-48 kubenswrapper[2572]: 
I0416 20:24:52.845254 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-model-cache\") pod \"d92bff9e-c823-4740-9e8c-bd283d0e313b\" (UID: \"d92bff9e-c823-4740-9e8c-bd283d0e313b\") " Apr 16 20:24:52.845715 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.845688 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-model-cache" (OuterVolumeSpecName: "model-cache") pod "d92bff9e-c823-4740-9e8c-bd283d0e313b" (UID: "d92bff9e-c823-4740-9e8c-bd283d0e313b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:52.845830 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.845701 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-home" (OuterVolumeSpecName: "home") pod "d92bff9e-c823-4740-9e8c-bd283d0e313b" (UID: "d92bff9e-c823-4740-9e8c-bd283d0e313b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:52.847644 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.847594 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-dshm" (OuterVolumeSpecName: "dshm") pod "d92bff9e-c823-4740-9e8c-bd283d0e313b" (UID: "d92bff9e-c823-4740-9e8c-bd283d0e313b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:52.847735 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.847695 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92bff9e-c823-4740-9e8c-bd283d0e313b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d92bff9e-c823-4740-9e8c-bd283d0e313b" (UID: "d92bff9e-c823-4740-9e8c-bd283d0e313b"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:24:52.847933 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.847906 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92bff9e-c823-4740-9e8c-bd283d0e313b-kube-api-access-snh5b" (OuterVolumeSpecName: "kube-api-access-snh5b") pod "d92bff9e-c823-4740-9e8c-bd283d0e313b" (UID: "d92bff9e-c823-4740-9e8c-bd283d0e313b"). InnerVolumeSpecName "kube-api-access-snh5b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:24:52.886176 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.886135 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d92bff9e-c823-4740-9e8c-bd283d0e313b" (UID: "d92bff9e-c823-4740-9e8c-bd283d0e313b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:52.945870 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.945792 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-home\") pod \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " Apr 16 20:24:52.945870 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.945841 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-tls-certs\") pod \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " Apr 16 20:24:52.946049 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.945879 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kserve-provision-location\") pod \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " Apr 16 20:24:52.946049 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.945906 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twscj\" (UniqueName: \"kubernetes.io/projected/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kube-api-access-twscj\") pod \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " Apr 16 20:24:52.946049 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.945942 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-model-cache\") pod \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " Apr 16 20:24:52.946049 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.945968 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-dshm\") pod \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\" (UID: \"150cd8db-a7e6-4238-9ad0-760a02ca4ba4\") " Apr 16 20:24:52.946279 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.946205 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-home" (OuterVolumeSpecName: "home") pod "150cd8db-a7e6-4238-9ad0-760a02ca4ba4" (UID: "150cd8db-a7e6-4238-9ad0-760a02ca4ba4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:52.946279 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.946222 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:52.946279 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.946245 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d92bff9e-c823-4740-9e8c-bd283d0e313b-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:52.946279 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.946263 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snh5b\" (UniqueName: \"kubernetes.io/projected/d92bff9e-c823-4740-9e8c-bd283d0e313b-kube-api-access-snh5b\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:52.946279 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.946279 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:52.946500 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.946293 2572 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:52.946500 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.946211 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-model-cache" (OuterVolumeSpecName: "model-cache") pod "150cd8db-a7e6-4238-9ad0-760a02ca4ba4" (UID: "150cd8db-a7e6-4238-9ad0-760a02ca4ba4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:52.946500 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.946307 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d92bff9e-c823-4740-9e8c-bd283d0e313b-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:52.948107 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.948067 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-dshm" (OuterVolumeSpecName: "dshm") pod "150cd8db-a7e6-4238-9ad0-760a02ca4ba4" (UID: "150cd8db-a7e6-4238-9ad0-760a02ca4ba4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:52.948495 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.948471 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "150cd8db-a7e6-4238-9ad0-760a02ca4ba4" (UID: "150cd8db-a7e6-4238-9ad0-760a02ca4ba4"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:24:52.948581 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.948504 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kube-api-access-twscj" (OuterVolumeSpecName: "kube-api-access-twscj") pod "150cd8db-a7e6-4238-9ad0-760a02ca4ba4" (UID: "150cd8db-a7e6-4238-9ad0-760a02ca4ba4"). InnerVolumeSpecName "kube-api-access-twscj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:24:52.962795 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.962771 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-8485dcccd-66n2h_150cd8db-a7e6-4238-9ad0-760a02ca4ba4/main/0.log" Apr 16 20:24:52.963444 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.963418 2572 generic.go:358] "Generic (PLEG): container finished" podID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerID="bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513" exitCode=137 Apr 16 20:24:52.963444 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.963431 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "150cd8db-a7e6-4238-9ad0-760a02ca4ba4" (UID: "150cd8db-a7e6-4238-9ad0-760a02ca4ba4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:52.963621 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.963443 2572 generic.go:358] "Generic (PLEG): container finished" podID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerID="73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150" exitCode=0 Apr 16 20:24:52.963621 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.963460 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" event={"ID":"150cd8db-a7e6-4238-9ad0-760a02ca4ba4","Type":"ContainerDied","Data":"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513"} Apr 16 20:24:52.963621 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.963493 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" event={"ID":"150cd8db-a7e6-4238-9ad0-760a02ca4ba4","Type":"ContainerDied","Data":"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150"} Apr 16 20:24:52.963621 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.963509 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" event={"ID":"150cd8db-a7e6-4238-9ad0-760a02ca4ba4","Type":"ContainerDied","Data":"50cae8db8888646371b2f14e6dddb7ca72350717592c7375b7ead3d9f791a022"} Apr 16 20:24:52.963621 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.963518 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h" Apr 16 20:24:52.963621 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.963529 2572 scope.go:117] "RemoveContainer" containerID="bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513" Apr 16 20:24:52.965020 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.964997 2572 generic.go:358] "Generic (PLEG): container finished" podID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerID="998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d" exitCode=137 Apr 16 20:24:52.965127 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.965046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" event={"ID":"d92bff9e-c823-4740-9e8c-bd283d0e313b","Type":"ContainerDied","Data":"998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d"} Apr 16 20:24:52.965127 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.965070 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" event={"ID":"d92bff9e-c823-4740-9e8c-bd283d0e313b","Type":"ContainerDied","Data":"f5a3866b5cb10ca74533c16afae8d7481951151c9ead99161964a143796d9862"} Apr 16 20:24:52.965127 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.965075 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl" Apr 16 20:24:52.984321 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.984298 2572 scope.go:117] "RemoveContainer" containerID="6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435" Apr 16 20:24:52.991440 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.991417 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl"] Apr 16 20:24:52.994908 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.994886 2572 scope.go:117] "RemoveContainer" containerID="73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150" Apr 16 20:24:52.995763 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:52.995737 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6b75fc484c-rqxnl"] Apr 16 20:24:53.004032 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.004012 2572 scope.go:117] "RemoveContainer" containerID="bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513" Apr 16 20:24:53.004350 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:24:53.004325 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513\": container with ID starting with bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513 not found: ID does not exist" containerID="bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513" Apr 16 20:24:53.004433 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.004359 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513"} err="failed to get container status \"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513\": rpc error: code = 
NotFound desc = could not find container \"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513\": container with ID starting with bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513 not found: ID does not exist" Apr 16 20:24:53.004433 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.004381 2572 scope.go:117] "RemoveContainer" containerID="6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435" Apr 16 20:24:53.004651 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:24:53.004615 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435\": container with ID starting with 6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435 not found: ID does not exist" containerID="6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435" Apr 16 20:24:53.004701 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.004662 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435"} err="failed to get container status \"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435\": rpc error: code = NotFound desc = could not find container \"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435\": container with ID starting with 6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435 not found: ID does not exist" Apr 16 20:24:53.004701 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.004685 2572 scope.go:117] "RemoveContainer" containerID="73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150" Apr 16 20:24:53.004961 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:24:53.004924 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150\": container with ID starting with 73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150 not found: ID does not exist" containerID="73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150" Apr 16 20:24:53.004961 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.004946 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150"} err="failed to get container status \"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150\": rpc error: code = NotFound desc = could not find container \"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150\": container with ID starting with 73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150 not found: ID does not exist" Apr 16 20:24:53.005108 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.004963 2572 scope.go:117] "RemoveContainer" containerID="bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513" Apr 16 20:24:53.005261 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.005238 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513"} err="failed to get container status \"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513\": rpc error: code = NotFound desc = could not find container \"bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513\": container with ID starting with bde189940f3079e6ecb2d9020e43340d68a64ab3c022357383d330eb8fa54513 not found: ID does not exist" Apr 16 20:24:53.005340 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.005264 2572 scope.go:117] "RemoveContainer" containerID="6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435" Apr 16 20:24:53.005546 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.005522 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435"} err="failed to get container status \"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435\": rpc error: code = NotFound desc = could not find container \"6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435\": container with ID starting with 6c1c32c806873db3f84bffd390ae3430a12ab7be9588373942926ba9fdeac435 not found: ID does not exist" Apr 16 20:24:53.005630 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.005565 2572 scope.go:117] "RemoveContainer" containerID="73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150" Apr 16 20:24:53.005912 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.005880 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150"} err="failed to get container status \"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150\": rpc error: code = NotFound desc = could not find container \"73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150\": container with ID starting with 73f47184264722da006d935a8d86c701750b1360f0e9dff7c089b89db63b3150 not found: ID does not exist" Apr 16 20:24:53.006019 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.005913 2572 scope.go:117] "RemoveContainer" containerID="998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d" Apr 16 20:24:53.008875 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.008855 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h"] Apr 16 20:24:53.012807 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.012785 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-8485dcccd-66n2h"] Apr 16 20:24:53.026330 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:24:53.026307 2572 scope.go:117] "RemoveContainer" containerID="aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa" Apr 16 20:24:53.047606 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.047586 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:53.047712 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.047612 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:53.047712 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.047629 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twscj\" (UniqueName: \"kubernetes.io/projected/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-kube-api-access-twscj\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:53.047712 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.047643 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:53.047712 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.047656 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:24:53.047712 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.047669 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/150cd8db-a7e6-4238-9ad0-760a02ca4ba4-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 
16 20:24:53.080949 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.080919 2572 scope.go:117] "RemoveContainer" containerID="998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d" Apr 16 20:24:53.081455 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:24:53.081430 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d\": container with ID starting with 998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d not found: ID does not exist" containerID="998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d" Apr 16 20:24:53.081550 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.081465 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d"} err="failed to get container status \"998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d\": rpc error: code = NotFound desc = could not find container \"998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d\": container with ID starting with 998a4fc4216eaaa4343d8725cd1acc1b5e5b811c8566e7110bc05e803373045d not found: ID does not exist" Apr 16 20:24:53.081550 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.081491 2572 scope.go:117] "RemoveContainer" containerID="aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa" Apr 16 20:24:53.081761 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:24:53.081741 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa\": container with ID starting with aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa not found: ID does not exist" containerID="aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa" Apr 16 20:24:53.081800 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.081769 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa"} err="failed to get container status \"aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa\": rpc error: code = NotFound desc = could not find container \"aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa\": container with ID starting with aacac551091c77eade4a549bf938a61a54ad1f9fea9c36bd15efddaa720d18fa not found: ID does not exist" Apr 16 20:24:53.367430 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.367402 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" path="/var/lib/kubelet/pods/150cd8db-a7e6-4238-9ad0-760a02ca4ba4/volumes" Apr 16 20:24:53.367846 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:24:53.367833 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" path="/var/lib/kubelet/pods/d92bff9e-c823-4740-9e8c-bd283d0e313b/volumes" Apr 16 20:25:02.386779 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:25:02.386732 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:25:12.385905 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:25:12.385812 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:25:22.386632 ip-10-0-128-48 
kubenswrapper[2572]: I0416 20:25:22.386585 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:25:32.386682 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:25:32.386629 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:25:42.386479 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:25:42.386437 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:25:52.385857 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:25:52.385820 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:26:02.385916 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:02.385871 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:26:12.386283 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:12.386236 2572 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 16 20:26:22.396049 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:22.396016 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:26:22.403769 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:22.403744 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:26:27.351673 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:27.351620 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl"] Apr 16 20:26:27.352066 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:27.351881 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" containerID="cri-o://e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0" gracePeriod=30 Apr 16 20:26:43.291955 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:43.291872 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:43.312327 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:43.312291 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:44.309403 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:44.309370 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:44.317672 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:44.317642 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:45.302607 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:45.302579 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:45.311380 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:45.311352 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:46.263173 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:46.263144 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:46.272344 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:46.272319 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:47.219543 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:47.219510 
2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:47.227825 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:47.227803 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:48.166812 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:48.166782 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:48.173597 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:48.173570 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:49.112764 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:49.112722 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:49.123185 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:49.123163 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:50.068061 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:50.068022 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:50.076030 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:50.076007 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:51.039244 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:51.039215 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:51.048436 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:51.048410 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:51.981645 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:51.981615 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:51.989008 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:51.988983 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:52.904071 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:52.904041 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:52.911219 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:52.911198 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:53.830514 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:53.830483 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:53.837421 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:53.837396 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:54.789239 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:54.789207 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:54.797676 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:54.797637 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:55.777710 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:55.777681 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:55.785283 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:55.785259 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/storage-initializer/0.log" Apr 16 20:26:56.850282 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:56.850253 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-5pvbr_33697ceb-0a50-49f2-830f-8bf9e962c02c/istio-proxy/0.log" Apr 16 20:26:56.865175 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:56.865152 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ingress_router-default-74fc646496-l58qj_142d9ab9-4b05-479d-b198-2760a09292d1/router/0.log" Apr 16 20:26:57.586774 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.586753 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:57.587084 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.587070 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:26:57.664397 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.664328 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-5pvbr_33697ceb-0a50-49f2-830f-8bf9e962c02c/istio-proxy/0.log" Apr 16 20:26:57.680433 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.680410 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-74fc646496-l58qj_142d9ab9-4b05-479d-b198-2760a09292d1/router/0.log" Apr 16 20:26:57.684900 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.684878 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7x24\" (UniqueName: \"kubernetes.io/projected/cb08cac1-521a-4c93-825e-0bb146316665-kube-api-access-p7x24\") pod \"cb08cac1-521a-4c93-825e-0bb146316665\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " Apr 16 20:26:57.685026 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.684914 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-kserve-provision-location\") pod \"cb08cac1-521a-4c93-825e-0bb146316665\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " Apr 16 20:26:57.685026 
ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.684949 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-dshm\") pod \"cb08cac1-521a-4c93-825e-0bb146316665\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " Apr 16 20:26:57.685026 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.684976 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-home\") pod \"cb08cac1-521a-4c93-825e-0bb146316665\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " Apr 16 20:26:57.685211 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.685030 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cb08cac1-521a-4c93-825e-0bb146316665-tls-certs\") pod \"cb08cac1-521a-4c93-825e-0bb146316665\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " Apr 16 20:26:57.685211 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.685060 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-model-cache\") pod \"cb08cac1-521a-4c93-825e-0bb146316665\" (UID: \"cb08cac1-521a-4c93-825e-0bb146316665\") " Apr 16 20:26:57.685434 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.685396 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-home" (OuterVolumeSpecName: "home") pod "cb08cac1-521a-4c93-825e-0bb146316665" (UID: "cb08cac1-521a-4c93-825e-0bb146316665"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:57.685569 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.685542 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-model-cache" (OuterVolumeSpecName: "model-cache") pod "cb08cac1-521a-4c93-825e-0bb146316665" (UID: "cb08cac1-521a-4c93-825e-0bb146316665"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:57.687111 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.687075 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb08cac1-521a-4c93-825e-0bb146316665-kube-api-access-p7x24" (OuterVolumeSpecName: "kube-api-access-p7x24") pod "cb08cac1-521a-4c93-825e-0bb146316665" (UID: "cb08cac1-521a-4c93-825e-0bb146316665"). InnerVolumeSpecName "kube-api-access-p7x24". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:26:57.687461 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.687438 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-dshm" (OuterVolumeSpecName: "dshm") pod "cb08cac1-521a-4c93-825e-0bb146316665" (UID: "cb08cac1-521a-4c93-825e-0bb146316665"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:57.687541 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.687505 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb08cac1-521a-4c93-825e-0bb146316665-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cb08cac1-521a-4c93-825e-0bb146316665" (UID: "cb08cac1-521a-4c93-825e-0bb146316665"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:26:57.749652 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.749619 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cb08cac1-521a-4c93-825e-0bb146316665" (UID: "cb08cac1-521a-4c93-825e-0bb146316665"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:26:57.785923 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.785898 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p7x24\" (UniqueName: \"kubernetes.io/projected/cb08cac1-521a-4c93-825e-0bb146316665-kube-api-access-p7x24\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:26:57.785923 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.785919 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-kserve-provision-location\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:26:57.786068 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.785929 2572 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-dshm\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:26:57.786068 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.785939 2572 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-home\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:26:57.786068 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.785948 2572 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cb08cac1-521a-4c93-825e-0bb146316665-tls-certs\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:26:57.786068 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:57.785957 2572 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cb08cac1-521a-4c93-825e-0bb146316665-model-cache\") on node \"ip-10-0-128-48.ec2.internal\" DevicePath \"\"" Apr 16 20:26:58.376298 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.376270 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl_cb08cac1-521a-4c93-825e-0bb146316665/main/0.log" Apr 16 20:26:58.376666 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.376557 2572 generic.go:358] "Generic (PLEG): container finished" podID="cb08cac1-521a-4c93-825e-0bb146316665" containerID="e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0" exitCode=137 Apr 16 20:26:58.376666 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.376618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" event={"ID":"cb08cac1-521a-4c93-825e-0bb146316665","Type":"ContainerDied","Data":"e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0"} Apr 16 20:26:58.376666 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.376639 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" Apr 16 20:26:58.376666 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.376660 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl" event={"ID":"cb08cac1-521a-4c93-825e-0bb146316665","Type":"ContainerDied","Data":"dc6faac0da577d2c9b26e878ade3371cecfb0c092a564592413bf0f17f79b109"} Apr 16 20:26:58.376835 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.376678 2572 scope.go:117] "RemoveContainer" containerID="e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0" Apr 16 20:26:58.396705 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.396686 2572 scope.go:117] "RemoveContainer" containerID="3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700" Apr 16 20:26:58.400209 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.400185 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl"] Apr 16 20:26:58.404548 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.404526 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-6d656b98b8c94tl"] Apr 16 20:26:58.406710 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.406690 2572 scope.go:117] "RemoveContainer" containerID="e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0" Apr 16 20:26:58.407004 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:26:58.406973 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0\": container with ID starting with e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0 not found: ID does not exist" 
containerID="e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0" Apr 16 20:26:58.407104 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.407010 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0"} err="failed to get container status \"e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0\": rpc error: code = NotFound desc = could not find container \"e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0\": container with ID starting with e50447c7d77ff073a540185635a43e934524e850c7415d5f5d13517da54fd9e0 not found: ID does not exist" Apr 16 20:26:58.407104 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.407032 2572 scope.go:117] "RemoveContainer" containerID="3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700" Apr 16 20:26:58.407300 ip-10-0-128-48 kubenswrapper[2572]: E0416 20:26:58.407281 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700\": container with ID starting with 3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700 not found: ID does not exist" containerID="3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700" Apr 16 20:26:58.407358 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.407306 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700"} err="failed to get container status \"3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700\": rpc error: code = NotFound desc = could not find container \"3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700\": container with ID starting with 3c4ccabd09a1aa5666de4113511affc0b4c802a0cfe883bb0fa40ac2a9532700 not found: ID does not exist" Apr 16 
20:26:58.531689 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:58.531663 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-ltgfr_9eaa2062-a5e2-480b-bcf7-9a163da37f94/manager/0.log" Apr 16 20:26:59.365866 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:26:59.365832 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb08cac1-521a-4c93-825e-0bb146316665" path="/var/lib/kubelet/pods/cb08cac1-521a-4c93-825e-0bb146316665/volumes" Apr 16 20:27:03.825931 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:03.825899 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9mwx9_fec5c910-3cf0-49a0-b436-62a7236c7d68/global-pull-secret-syncer/0.log" Apr 16 20:27:03.895370 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:03.895344 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dxbfj_9fd607fa-9f37-4836-a69c-147c1052dbc4/konnectivity-agent/0.log" Apr 16 20:27:03.976728 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:03.976704 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-48.ec2.internal_0ef9ecbb9850ae7e0cccd45a7695b7be/haproxy/0.log" Apr 16 20:27:08.259052 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:08.259016 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-ltgfr_9eaa2062-a5e2-480b-bcf7-9a163da37f94/manager/0.log" Apr 16 20:27:09.425730 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:09.425698 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-rcxgg_b9ff25d1-4296-4a79-9bfa-cad826fb48cb/cluster-monitoring-operator/0.log" Apr 16 20:27:09.755985 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:09.755909 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-ghm4g_3b43c525-2b23-4b6e-b52f-b7b02c633eb0/node-exporter/0.log" Apr 16 20:27:09.775198 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:09.775174 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ghm4g_3b43c525-2b23-4b6e-b52f-b7b02c633eb0/kube-rbac-proxy/0.log" Apr 16 20:27:09.797705 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:09.797685 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ghm4g_3b43c525-2b23-4b6e-b52f-b7b02c633eb0/init-textfile/0.log" Apr 16 20:27:11.474016 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:11.473990 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-lcvgf_dea42027-362c-41e6-a940-a20b985788b0/networking-console-plugin/0.log" Apr 16 20:27:12.032214 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.032184 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/2.log" Apr 16 20:27:12.037417 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.037391 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2xttw_d7cbe800-699f-48fe-9a8a-b74c32bf0dcc/console-operator/3.log" Apr 16 20:27:12.783054 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783023 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx"] Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783311 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="tokenizer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783322 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="tokenizer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783332 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="storage-initializer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783338 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="storage-initializer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783345 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="storage-initializer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783351 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="storage-initializer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783359 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="storage-initializer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783364 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="storage-initializer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783377 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="llm-d-routing-sidecar" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783382 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="llm-d-routing-sidecar" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783388 2572 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783393 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783400 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783405 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783410 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="main" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783415 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="main" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783424 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783429 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783438 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="storage-initializer" Apr 16 20:27:12.783435 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783443 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="storage-initializer" Apr 16 20:27:12.784024 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783488 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d92bff9e-c823-4740-9e8c-bd283d0e313b" containerName="main" Apr 16 20:27:12.784024 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783496 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="main" Apr 16 20:27:12.784024 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783503 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="main" Apr 16 20:27:12.784024 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783509 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb08cac1-521a-4c93-825e-0bb146316665" containerName="main" Apr 16 20:27:12.784024 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783517 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="150cd8db-a7e6-4238-9ad0-760a02ca4ba4" containerName="llm-d-routing-sidecar" Apr 16 20:27:12.784024 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.783525 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7c4371d-4330-457c-8865-69b36bd1a5d2" containerName="tokenizer" Apr 16 20:27:12.788600 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.788581 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:12.792723 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.792701 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-plvk7\"/\"kube-root-ca.crt\"" Apr 16 20:27:12.794073 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.794051 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-plvk7\"/\"default-dockercfg-57lmf\"" Apr 16 20:27:12.794195 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.794110 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-plvk7\"/\"openshift-service-ca.crt\"" Apr 16 20:27:12.796467 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.796443 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx"] Apr 16 20:27:12.905782 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.905755 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-sys\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:12.905915 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.905792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-proc\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:12.905915 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.905811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-lib-modules\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:12.905915 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.905827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrf6w\" (UniqueName: \"kubernetes.io/projected/73e49f2e-6ee9-4783-8ab9-74b87946b467-kube-api-access-xrf6w\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:12.905915 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.905872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-podres\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:12.998132 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:12.998087 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-6cnrc_72d1ed32-dd1f-4c30-adc3-db411de6c394/volume-data-source-validator/0.log" Apr 16 20:27:13.006377 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-podres\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.006495 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006412 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-sys\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.006495 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-proc\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.006495 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-lib-modules\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.006629 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrf6w\" (UniqueName: \"kubernetes.io/projected/73e49f2e-6ee9-4783-8ab9-74b87946b467-kube-api-access-xrf6w\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.006629 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-podres\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 
16 20:27:13.006629 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006517 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-sys\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.006629 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006565 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-proc\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.006629 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.006613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73e49f2e-6ee9-4783-8ab9-74b87946b467-lib-modules\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.014283 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.014266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrf6w\" (UniqueName: \"kubernetes.io/projected/73e49f2e-6ee9-4783-8ab9-74b87946b467-kube-api-access-xrf6w\") pod \"perf-node-gather-daemonset-hk2hx\" (UID: \"73e49f2e-6ee9-4783-8ab9-74b87946b467\") " pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.099289 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.099266 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.217250 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.217225 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx"] Apr 16 20:27:13.219563 ip-10-0-128-48 kubenswrapper[2572]: W0416 20:27:13.219535 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod73e49f2e_6ee9_4783_8ab9_74b87946b467.slice/crio-840c29f17cae0f57566dcedd03b5fbea2952c020538774cdd581ad90e3eee9e6 WatchSource:0}: Error finding container 840c29f17cae0f57566dcedd03b5fbea2952c020538774cdd581ad90e3eee9e6: Status 404 returned error can't find the container with id 840c29f17cae0f57566dcedd03b5fbea2952c020538774cdd581ad90e3eee9e6 Apr 16 20:27:13.424126 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.424024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" event={"ID":"73e49f2e-6ee9-4783-8ab9-74b87946b467","Type":"ContainerStarted","Data":"62e7b0d01f25eebc91b970becdb91e7594b2a9dbefb1f28dde1db222fd022e87"} Apr 16 20:27:13.424126 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.424069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" event={"ID":"73e49f2e-6ee9-4783-8ab9-74b87946b467","Type":"ContainerStarted","Data":"840c29f17cae0f57566dcedd03b5fbea2952c020538774cdd581ad90e3eee9e6"} Apr 16 20:27:13.424344 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.424123 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" Apr 16 20:27:13.441340 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.441299 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx" 
podStartSLOduration=1.441287688 podStartE2EDuration="1.441287688s" podCreationTimestamp="2026-04-16 20:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:13.439478904 +0000 UTC m=+2000.618240929" watchObservedRunningTime="2026-04-16 20:27:13.441287688 +0000 UTC m=+2000.620049713"
Apr 16 20:27:13.698573 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.698500 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9clf4_40b37910-94fc-4da8-aa1d-163af810d004/dns/0.log"
Apr 16 20:27:13.719509 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.719486 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9clf4_40b37910-94fc-4da8-aa1d-163af810d004/kube-rbac-proxy/0.log"
Apr 16 20:27:13.869307 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:13.869280 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t4xmk_e34d0c6b-2806-42a6-9665-4b769fb05f24/dns-node-resolver/0.log"
Apr 16 20:27:14.357896 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:14.357861 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2jzjc_b9261443-4578-43ee-abc0-7931d8ab9f10/node-ca/0.log"
Apr 16 20:27:15.229955 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:15.229924 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-5pvbr_33697ceb-0a50-49f2-830f-8bf9e962c02c/istio-proxy/0.log"
Apr 16 20:27:15.253289 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:15.253264 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-74fc646496-l58qj_142d9ab9-4b05-479d-b198-2760a09292d1/router/0.log"
Apr 16 20:27:15.703791 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:15.703765 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ppsrs_2676cacd-954e-4b7c-a85c-1b43b90f0471/serve-healthcheck-canary/0.log"
Apr 16 20:27:16.146085 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:16.146057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gm4s4_c485156a-3973-41b9-9936-2cd58d6d6ea4/insights-operator/0.log"
Apr 16 20:27:16.146251 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:16.146152 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gm4s4_c485156a-3973-41b9-9936-2cd58d6d6ea4/insights-operator/1.log"
Apr 16 20:27:16.239623 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:16.239598 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cl59r_9b0ca5b9-11b5-46e9-80c6-d8a389e78051/kube-rbac-proxy/0.log"
Apr 16 20:27:16.260555 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:16.260525 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cl59r_9b0ca5b9-11b5-46e9-80c6-d8a389e78051/exporter/0.log"
Apr 16 20:27:16.281408 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:16.281382 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cl59r_9b0ca5b9-11b5-46e9-80c6-d8a389e78051/extractor/0.log"
Apr 16 20:27:18.910610 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:18.910584 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-ddc57ffc5-tnlfz_58fbb2f0-9cdf-4a93-8d11-f46cb14742e4/manager/0.log"
Apr 16 20:27:19.437994 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:19.437962 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-plvk7/perf-node-gather-daemonset-hk2hx"
Apr 16 20:27:24.933846 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:24.933809 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-g96dz_499fb04b-1629-4d2a-8d4c-4b6f38ec093e/kube-storage-version-migrator-operator/1.log"
Apr 16 20:27:24.935021 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:24.934997 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-g96dz_499fb04b-1629-4d2a-8d4c-4b6f38ec093e/kube-storage-version-migrator-operator/0.log"
Apr 16 20:27:26.127618 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.127585 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j48bx_2cd1935c-d57b-4e12-881b-0c81444e85ac/kube-multus-additional-cni-plugins/0.log"
Apr 16 20:27:26.147562 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.147538 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j48bx_2cd1935c-d57b-4e12-881b-0c81444e85ac/egress-router-binary-copy/0.log"
Apr 16 20:27:26.165499 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.165471 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j48bx_2cd1935c-d57b-4e12-881b-0c81444e85ac/cni-plugins/0.log"
Apr 16 20:27:26.184029 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.184006 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j48bx_2cd1935c-d57b-4e12-881b-0c81444e85ac/bond-cni-plugin/0.log"
Apr 16 20:27:26.201856 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.201838 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j48bx_2cd1935c-d57b-4e12-881b-0c81444e85ac/routeoverride-cni/0.log"
Apr 16 20:27:26.220748 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.220708 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j48bx_2cd1935c-d57b-4e12-881b-0c81444e85ac/whereabouts-cni-bincopy/0.log"
Apr 16 20:27:26.238750 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.238732 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j48bx_2cd1935c-d57b-4e12-881b-0c81444e85ac/whereabouts-cni/0.log"
Apr 16 20:27:26.472054 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.471979 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mjhhc_74091848-cf67-4c6a-9ad9-adc12a5e47ad/kube-multus/0.log"
Apr 16 20:27:26.490868 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.490832 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5j478_458f83e2-e97a-457a-9081-a5ae099b6973/network-metrics-daemon/0.log"
Apr 16 20:27:26.508135 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:26.508115 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5j478_458f83e2-e97a-457a-9081-a5ae099b6973/kube-rbac-proxy/0.log"
Apr 16 20:27:27.653125 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.653081 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-controller/0.log"
Apr 16 20:27:27.668780 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.668753 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/0.log"
Apr 16 20:27:27.676839 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.676822 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovn-acl-logging/1.log"
Apr 16 20:27:27.697119 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.697083 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/kube-rbac-proxy-node/0.log"
Apr 16 20:27:27.716443 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.716427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 20:27:27.733986 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.733969 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/northd/0.log"
Apr 16 20:27:27.753430 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.753413 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/nbdb/0.log"
Apr 16 20:27:27.773134 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.773118 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/sbdb/0.log"
Apr 16 20:27:27.880215 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:27.880193 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bsbmz_40cddc46-c9be-411b-92b8-a65e04009fc9/ovnkube-controller/0.log"
Apr 16 20:27:29.235470 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:29.235443 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-c8jsl_528c7eb6-e54f-42e5-bd14-acbc205001c1/check-endpoints/0.log"
Apr 16 20:27:29.282263 ip-10-0-128-48 kubenswrapper[2572]: I0416 20:27:29.282235 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qvsgg_dc2e7638-ebb5-4713-a221-8c885ed0b19d/network-check-target-container/0.log"