Apr 21 15:35:04.733078 ip-10-0-136-123 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:35:05.226390 ip-10-0-136-123 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:05.226390 ip-10-0-136-123 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:35:05.226390 ip-10-0-136-123 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:05.226390 ip-10-0-136-123 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 15:35:05.226390 ip-10-0-136-123 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:35:05.228647 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.228555 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:35:05.233760 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233744 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233762 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233766 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233770 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233773 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233777 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233780 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233783 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233786 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233789 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:05.233792 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233803 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233809 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233813 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233818 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233822 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233826 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233829 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233834 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233838 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233841 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233844 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233846 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233849 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233852 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233855 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233857 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233860 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233862 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233865 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233868 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:05.234074 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233870 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233873 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233875 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233878 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233881 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233883 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233886 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233888 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233891 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233894 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233897 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233900 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233902 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233905 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233908 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233913 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233917 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233920 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233923 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:05.234544 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233926 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233929 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233932 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233934 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233937 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233939 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233942 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233945 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233947 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233950 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233952 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233955 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233957 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233959 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233962 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233964 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233967 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233969 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233972 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233974 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:05.235014 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233977 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233980 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233983 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233985 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233988 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233990 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233993 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233997 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.233999 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234002 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234004 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234007 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234010 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234012 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234015 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234017 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234020 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234430 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234435 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:35:05.235507 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234438 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234441 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234443 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234446 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234449 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234452 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234455 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234457 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234460 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234463 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234465 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234468 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234470 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234473 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234476 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234479 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234481 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234484 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234486 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234489 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:35:05.235964 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234493 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234496 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234499 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234501 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234504 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234506 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234509 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234511 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234514 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234516 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234519 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234521 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234524 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234526 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234529 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234531 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234534 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234536 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234539 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234542 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:35:05.236440 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234544 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234547 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234550 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234553 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234555 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234558 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234561 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234563 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234567 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234569 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234572 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234575 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234578 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234581 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234584 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234586 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234590 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234594 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234597 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234599 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:35:05.236962 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234602 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234605 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234607 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234609 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234612 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234615 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234617 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234620 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234622 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234625 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234627 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234630 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234632 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234635 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234640 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234643 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234646 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234648 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234651 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:35:05.237453 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234653 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234656 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234659 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234662 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.234665 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236210 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236220 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236227 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236232 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236237 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236241 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236246 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236251 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236254 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236257 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236261 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236264 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236267 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236270 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236273 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236276 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236279 2573 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236282 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236285 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:35:05.237928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236290 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236293 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236297 2573 flags.go:64] FLAG: --config-dir=""
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236300 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236303 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236307 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236310 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236314 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236317 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236321 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236324 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236327 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236331 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236334 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236339 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236342 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236345 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236349 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236352 2573 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236355 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236359 2573 flags.go:64] FLAG: --event-burst="100"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236363 2573 flags.go:64] FLAG: --event-qps="50"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236366 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236369 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236372 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 21 15:35:05.238506 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236376 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236379 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236383 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236386 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236389 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236392 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236395 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236398 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236401 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236404 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236407 2573 flags.go:64] FLAG: --feature-gates=""
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236411 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236414 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236417 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236421 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236424 2573 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236427 2573 flags.go:64] FLAG: --help="false"
Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236430 2573 flags.go:64] FLAG:
--hostname-override="ip-10-0-136-123.ec2.internal" Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236433 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236437 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236440 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236444 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236447 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236451 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:35:05.239131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236454 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236457 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236460 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236462 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236466 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236469 2573 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236472 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:35:05.239689 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:35:05.236475 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236478 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236481 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236484 2573 flags.go:64] FLAG: --lock-file="" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236487 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236490 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236493 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236498 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236501 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236504 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236507 2573 flags.go:64] FLAG: --logging-format="text" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236509 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236513 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236516 2573 flags.go:64] FLAG: --manifest-url="" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236518 2573 flags.go:64] FLAG: 
--manifest-url-header="" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236523 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236527 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236531 2573 flags.go:64] FLAG: --max-pods="110" Apr 21 15:35:05.239689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236534 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236537 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236540 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236546 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236550 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236553 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236556 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236564 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236567 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236571 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236574 2573 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:35:05.240306 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236577 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236582 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236585 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236588 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236591 2573 flags.go:64] FLAG: --port="10250" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236594 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236597 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02193db40e4e35ee2" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236601 2573 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236604 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236607 2573 flags.go:64] FLAG: --register-node="true" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236610 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236613 2573 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236617 2573 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:35:05.240306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236620 2573 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: 
I0421 15:35:05.236622 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236625 2573 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236629 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236632 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236635 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236638 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236641 2573 flags.go:64] FLAG: --runonce="false" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236644 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236647 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236650 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236655 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236658 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236662 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236665 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236668 2573 flags.go:64] FLAG: --storage-driver-password="root" 
Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236671 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236673 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236676 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236679 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236682 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236685 2573 flags.go:64] FLAG: --system-cgroups="" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236688 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236693 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236696 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 21 15:35:05.240925 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236699 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236704 2573 flags.go:64] FLAG: --tls-min-version="" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236707 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236709 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236713 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236716 
2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236719 2573 flags.go:64] FLAG: --v="2" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236723 2573 flags.go:64] FLAG: --version="false" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236727 2573 flags.go:64] FLAG: --vmodule="" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236731 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.236735 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236860 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236865 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236868 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236871 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236874 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236877 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236879 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236884 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 
15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236886 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236889 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236892 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:05.241520 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236895 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236897 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236900 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236902 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236905 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236908 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236910 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236913 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236915 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236917 2573 feature_gate.go:328] unrecognized 
feature gate: GatewayAPIController Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236920 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236923 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236925 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236928 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236930 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236933 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236935 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236938 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236940 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236943 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:05.242090 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236945 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236948 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 
15:35:05.236950 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236956 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236958 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236961 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236963 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236966 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236970 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236973 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236976 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236979 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236982 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236984 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236987 2573 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236989 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236992 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236994 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.236997 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:05.242593 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237000 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237002 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237006 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237010 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237013 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237017 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237021 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237024 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237026 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237029 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237032 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237035 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237037 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237040 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237043 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237046 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237050 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237053 2573 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstall Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237055 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:05.243084 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237058 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237060 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237064 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237066 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237069 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237071 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237075 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237077 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237080 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237082 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237085 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:05.243727 ip-10-0-136-123 
kubenswrapper[2573]: W0421 15:35:05.237087 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237089 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237092 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237094 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237097 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:05.243727 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.237099 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:05.244368 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.238018 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:05.246061 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.246033 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 15:35:05.246061 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.246057 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246149 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:05.246235 ip-10-0-136-123 
kubenswrapper[2573]: W0421 15:35:05.246157 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246162 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246167 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246171 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246176 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246180 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246184 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246188 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246192 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246197 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246201 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246206 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246210 2573 feature_gate.go:328] unrecognized feature gate: 
SignatureStores Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246217 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246223 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246228 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246233 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:05.246235 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246237 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246241 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246246 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246250 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246254 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246259 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246264 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246268 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:35:05.247112 
ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246272 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246276 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246281 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246285 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246289 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246293 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246297 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246301 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246305 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246310 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246314 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246320 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:05.247112 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246326 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246330 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246334 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246339 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246343 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246347 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246351 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246355 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246359 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246364 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246368 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246372 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246376 2573 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246381 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246386 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246391 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246396 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246400 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246404 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246409 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:35:05.247677 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246413 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246417 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246421 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246426 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246430 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:05.248372 ip-10-0-136-123 
kubenswrapper[2573]: W0421 15:35:05.246434 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246438 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246442 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246446 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246450 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246454 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246458 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246463 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246468 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246472 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246476 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246480 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246484 2573 feature_gate.go:328] unrecognized feature 
gate: NewOLMOwnSingleNamespace Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246489 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246494 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:05.248372 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246498 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246502 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246506 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246510 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246515 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246519 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246523 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246528 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.246536 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246692 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246701 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246707 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246712 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246716 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246721 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246725 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:35:05.249017 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246729 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246733 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246737 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246741 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:35:05.249522 ip-10-0-136-123 
kubenswrapper[2573]: W0421 15:35:05.246746 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246750 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246753 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246758 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246762 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246767 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246771 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246775 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246779 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246783 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246787 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246811 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246816 2573 feature_gate.go:328] 
unrecognized feature gate: NewOLM Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246821 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246825 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246830 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:35:05.249522 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246833 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246837 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246842 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246846 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246851 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246855 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246859 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246863 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246867 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:35:05.250033 ip-10-0-136-123 
kubenswrapper[2573]: W0421 15:35:05.246871 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246876 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246880 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246884 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246887 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246892 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246896 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246900 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246904 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246909 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246913 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:35:05.250033 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246917 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246921 2573 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246925 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246932 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246939 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246944 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246949 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246953 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246959 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246963 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246967 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246971 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246976 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246980 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks 
Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246985 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246989 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246994 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.246998 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247002 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:35:05.250540 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247006 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247010 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247014 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247018 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247022 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247026 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247030 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247034 2573 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247038 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247050 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247055 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247059 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247064 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247068 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247072 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247077 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247082 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247088 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247094 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:35:05.251173 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:05.247099 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:35:05.252025 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.247108 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:35:05.252025 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.248094 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 15:35:05.252496 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.252477 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 15:35:05.253778 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.253765 2573 server.go:1019] "Starting client certificate rotation" Apr 21 15:35:05.253870 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.253824 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 15:35:05.253870 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.253866 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 15:35:05.282343 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.282316 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 15:35:05.285019 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.284993 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 15:35:05.298081 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.298063 2573 log.go:25] "Validated CRI v1 runtime API" Apr 21 15:35:05.304256 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.304240 2573 log.go:25] "Validated CRI v1 image API" Apr 21 15:35:05.306014 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.305996 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 15:35:05.310948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.310929 2573 fs.go:135] Filesystem UUIDs: map[02205959-de89-4fb8-9603-6a31f93a94d8:/dev/nvme0n1p3 30d5a790-79c4-465a-bb15-e78c63360316:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 21 15:35:05.311004 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.310949 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 15:35:05.317123 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.317004 2573 manager.go:217] Machine: {Timestamp:2026-04-21 15:35:05.314690053 +0000 UTC m=+0.444743678 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100404 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] 
MachineID:ec219dd26599ff2792b5571d60dc0533 SystemUUID:ec219dd2-6599-ff27-92b5-571d60dc0533 BootID:c67114f4-1c94-4497-876f-b90f41d39cae Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b4:46:3c:3e:b9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b4:46:3c:3e:b9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:50:f0:86:5d:bc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 
Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 15:35:05.317123 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.317122 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 21 15:35:05.317226 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.317209 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 15:35:05.319065 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.319034 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 15:35:05.319206 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.319067 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-123.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":
"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 15:35:05.319246 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.319215 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 15:35:05.319246 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.319225 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 15:35:05.319246 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.319238 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 15:35:05.319331 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.319248 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 15:35:05.320346 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.320336 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 21 15:35:05.320455 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.320446 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 15:35:05.324253 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.324241 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 21 15:35:05.324288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.324263 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 15:35:05.324288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.324275 2573 
file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 15:35:05.324288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.324285 2573 kubelet.go:397] "Adding apiserver pod source" Apr 21 15:35:05.324409 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.324295 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 15:35:05.325862 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.325847 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 15:35:05.325904 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.325874 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 15:35:05.328426 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.328404 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:35:05.329271 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.329238 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 15:35:05.330645 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.330632 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 15:35:05.333363 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333351 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 15:35:05.333422 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333369 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 15:35:05.333422 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333375 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 15:35:05.333422 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333380 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 15:35:05.333422 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333386 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 15:35:05.333422 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333392 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 15:35:05.333422 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333398 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 15:35:05.333422 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333404 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 15:35:05.333422 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333412 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 15:35:05.333422 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333421 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 15:35:05.333650 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333431 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 15:35:05.333650 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.333439 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 15:35:05.335429 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.335416 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 15:35:05.335471 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.335431 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 15:35:05.338931 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.338918 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 15:35:05.338989 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.338957 2573 server.go:1295] 
"Started kubelet" Apr 21 15:35:05.339060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.339024 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 15:35:05.339149 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.339111 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 15:35:05.339199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.339163 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 15:35:05.339975 ip-10-0-136-123 systemd[1]: Started Kubernetes Kubelet. Apr 21 15:35:05.340245 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.340227 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 21 15:35:05.340526 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.340514 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 15:35:05.344394 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.344376 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-123.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 15:35:05.344394 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.344385 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 15:35:05.344545 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.344409 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-123.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 15:35:05.349435 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.349416 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 15:35:05.350232 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.350218 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 15:35:05.351692 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.351663 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found" Apr 21 15:35:05.351975 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.351953 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 15:35:05.351975 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.351976 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 15:35:05.352280 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.352238 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 21 15:35:05.352280 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.352251 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 21 15:35:05.352594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.352572 2573 factory.go:55] Registering systemd factory Apr 21 15:35:05.352738 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.352721 2573 factory.go:223] Registration of the systemd container factory successfully Apr 21 15:35:05.352861 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.352653 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 15:35:05.353030 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.353017 2573 factory.go:153] Registering CRI-O factory Apr 21 15:35:05.353099 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.353033 2573 factory.go:223] Registration of the crio container factory successfully Apr 21 15:35:05.353099 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:35:05.353089 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 15:35:05.353224 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.353106 2573 factory.go:103] Registering Raw factory Apr 21 15:35:05.353224 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.353121 2573 manager.go:1196] Started watching for new ooms in manager Apr 21 15:35:05.353776 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.353760 2573 manager.go:319] Starting recovery of all containers Apr 21 15:35:05.354339 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.354306 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-123.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 15:35:05.354753 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.354728 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 15:35:05.355370 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.355343 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 15:35:05.355463 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.354195 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-123.ec2.internal.18a8692f68b8031d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-123.ec2.internal,UID:ip-10-0-136-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-123.ec2.internal,},FirstTimestamp:2026-04-21 15:35:05.338929949 +0000 UTC m=+0.468983573,LastTimestamp:2026-04-21 15:35:05.338929949 +0000 UTC m=+0.468983573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-123.ec2.internal,}" Apr 21 15:35:05.364007 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.363849 2573 manager.go:324] Recovery completed Apr 21 15:35:05.368398 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.368387 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:35:05.371314 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.371298 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:35:05.371389 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.371325 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:35:05.371389 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.371336 2573 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:35:05.371857 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.371836 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 15:35:05.371857 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.371856 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 15:35:05.371968 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.371876 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 21 15:35:05.374435 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.374423 2573 policy_none.go:49] "None policy: Start" Apr 21 15:35:05.374472 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.374440 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 15:35:05.374472 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.374450 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 21 15:35:05.380090 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.380023 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-123.ec2.internal.18a8692f6aa620ae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-123.ec2.internal,UID:ip-10-0-136-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-123.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-123.ec2.internal,},FirstTimestamp:2026-04-21 15:35:05.371312302 +0000 UTC m=+0.501365927,LastTimestamp:2026-04-21 15:35:05.371312302 +0000 UTC m=+0.501365927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-123.ec2.internal,}" Apr 21 
15:35:05.393206 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.393135 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-123.ec2.internal.18a8692f6aa666d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-123.ec2.internal,UID:ip-10-0-136-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-136-123.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-136-123.ec2.internal,},FirstTimestamp:2026-04-21 15:35:05.371330258 +0000 UTC m=+0.501383883,LastTimestamp:2026-04-21 15:35:05.371330258 +0000 UTC m=+0.501383883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-123.ec2.internal,}" Apr 21 15:35:05.405216 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.405141 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-123.ec2.internal.18a8692f6aa68bf7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-123.ec2.internal,UID:ip-10-0-136-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-136-123.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-136-123.ec2.internal,},FirstTimestamp:2026-04-21 15:35:05.371339767 +0000 UTC m=+0.501393392,LastTimestamp:2026-04-21 15:35:05.371339767 +0000 UTC m=+0.501393392,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-123.ec2.internal,}" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.407452 2573 manager.go:341] "Starting Device Plugin manager" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.407481 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.407491 2573 server.go:85] "Starting device plugin registration server" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.407693 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.407703 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.407782 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.407870 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.407880 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.408402 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.408440 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-123.ec2.internal\" not found" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.415949 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l2rzg" Apr 21 15:35:05.424894 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.422702 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-123.ec2.internal.18a8692f6cefd0ed default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-123.ec2.internal,UID:ip-10-0-136-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-136-123.ec2.internal,},FirstTimestamp:2026-04-21 15:35:05.409695981 +0000 UTC m=+0.539749597,LastTimestamp:2026-04-21 15:35:05.409695981 +0000 UTC m=+0.539749597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-123.ec2.internal,}" Apr 21 15:35:05.426675 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.426659 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l2rzg" Apr 21 15:35:05.474848 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.474788 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 21 15:35:05.475995 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.475981 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 15:35:05.476060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.476008 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 15:35:05.476060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.476033 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 15:35:05.476060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.476041 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 15:35:05.476185 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.476074 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 15:35:05.494575 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.494512 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:05.507842 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.507826 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:35:05.508697 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.508679 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:35:05.508697 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.508710 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:35:05.508860 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.508720 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:35:05.508860 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.508747 2573 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-123.ec2.internal" Apr 21 15:35:05.520624 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.520607 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-123.ec2.internal" Apr 21 15:35:05.520727 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.520631 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-123.ec2.internal\": node \"ip-10-0-136-123.ec2.internal\" not found" Apr 21 15:35:05.546091 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.546061 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found" Apr 21 15:35:05.577106 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.577081 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal"] Apr 21 15:35:05.577189 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.577150 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:35:05.578655 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.578641 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:35:05.578710 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.578670 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:35:05.578710 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.578684 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:35:05.580145 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.580132 2573 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 21 15:35:05.580838 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.580823 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:35:05.580907 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.580851 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:35:05.580907 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.580865 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:35:05.580907 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.580892 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal" Apr 21 15:35:05.580993 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.580920 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:35:05.581541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.581520 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:35:05.581592 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.581552 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:35:05.581592 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.581565 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:35:05.582438 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.582426 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal" Apr 21 15:35:05.582483 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.582465 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:35:05.583093 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.583076 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:35:05.583166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.583104 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:35:05.583166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.583115 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:35:05.607551 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.607522 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-123.ec2.internal\" not found" node="ip-10-0-136-123.ec2.internal" Apr 21 15:35:05.611952 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.611935 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-123.ec2.internal\" not found" node="ip-10-0-136-123.ec2.internal" Apr 21 15:35:05.646930 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.646903 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found" Apr 21 15:35:05.747271 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.747185 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found" Apr 21 15:35:05.753568 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.753546 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a041855c13fefe5756acf5eec467e79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal\" (UID: \"9a041855c13fefe5756acf5eec467e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.753618 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.753577 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a041855c13fefe5756acf5eec467e79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal\" (UID: \"9a041855c13fefe5756acf5eec467e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.753618 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.753600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e27d8eec64ea5a1e06e35be5839e2c48-config\") pod \"kube-apiserver-proxy-ip-10-0-136-123.ec2.internal\" (UID: \"e27d8eec64ea5a1e06e35be5839e2c48\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.848028 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.847983 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found"
Apr 21 15:35:05.854421 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.854399 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a041855c13fefe5756acf5eec467e79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal\" (UID: \"9a041855c13fefe5756acf5eec467e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.854517 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.854431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a041855c13fefe5756acf5eec467e79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal\" (UID: \"9a041855c13fefe5756acf5eec467e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.854517 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.854456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e27d8eec64ea5a1e06e35be5839e2c48-config\") pod \"kube-apiserver-proxy-ip-10-0-136-123.ec2.internal\" (UID: \"e27d8eec64ea5a1e06e35be5839e2c48\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.854517 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.854497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9a041855c13fefe5756acf5eec467e79-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal\" (UID: \"9a041855c13fefe5756acf5eec467e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.854634 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.854513 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a041855c13fefe5756acf5eec467e79-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal\" (UID: \"9a041855c13fefe5756acf5eec467e79\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.854634 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.854514 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e27d8eec64ea5a1e06e35be5839e2c48-config\") pod \"kube-apiserver-proxy-ip-10-0-136-123.ec2.internal\" (UID: \"e27d8eec64ea5a1e06e35be5839e2c48\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.909579 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.909540 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.914138 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:05.914120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:05.948753 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:05.948718 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found"
Apr 21 15:35:06.049435 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.049340 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found"
Apr 21 15:35:06.149963 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.149913 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found"
Apr 21 15:35:06.203149 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.203122 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:06.250277 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.250236 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-123.ec2.internal\" not found"
Apr 21 15:35:06.253470 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.253454 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 15:35:06.253598 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.253579 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:35:06.253656 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.253621 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 15:35:06.322397 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.322372 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:06.326019 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.325995 2573 apiserver.go:52] "Watching apiserver"
Apr 21 15:35:06.337847 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.337780 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 15:35:06.338115 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.338095 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4qs56","openshift-ovn-kubernetes/ovnkube-node-9v44z","kube-system/konnectivity-agent-xph98","openshift-image-registry/node-ca-m9xpg","openshift-multus/multus-additional-cni-plugins-gkvb2","openshift-multus/multus-rvv9j","openshift-multus/network-metrics-daemon-28b7m","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h","openshift-cluster-node-tuning-operator/tuned-2flj5","openshift-dns/node-resolver-lkj85","openshift-network-diagnostics/network-check-target-ntgnx"]
Apr 21 15:35:06.339593 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.339571 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4qs56"
Apr 21 15:35:06.340849 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.340826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.342225 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.342199 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xph98"
Apr 21 15:35:06.342326 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.342227 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8q28v\""
Apr 21 15:35:06.342326 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.342278 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 15:35:06.342452 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.342278 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:35:06.342452 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.342417 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 15:35:06.343850 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.343828 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m9xpg"
Apr 21 15:35:06.344087 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.344063 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 15:35:06.344302 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.344270 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 15:35:06.344407 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.344368 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 15:35:06.344780 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.344760 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5qjrv\""
Apr 21 15:35:06.344899 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.344846 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 15:35:06.344899 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.344877 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 15:35:06.345325 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.345308 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.345676 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.345652 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 15:35:06.348999 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.348770 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 15:35:06.348999 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.348781 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v64vk\""
Apr 21 15:35:06.348999 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.348954 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-j7cc2\""
Apr 21 15:35:06.349320 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.349243 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 15:35:06.349577 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.349489 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 15:35:06.349659 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.349595 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 15:35:06.349878 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.349857 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 15:35:06.350070 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.350046 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.350755 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.350385 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"
Apr 21 15:35:06.350854 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.350770 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 15:35:06.350854 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.350610 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 15:35:06.350854 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.350842 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 15:35:06.350986 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.350872 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 15:35:06.350986 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.350928 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 15:35:06.350986 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.350956 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5ftvd\""
Apr 21 15:35:06.351121 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.351014 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 15:35:06.352078 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.352059 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:06.352169 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.352136 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:06.353757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.353738 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h"
Apr 21 15:35:06.354413 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.354389 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 15:35:06.354496 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.354449 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qxmnr\""
Apr 21 15:35:06.355336 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.355319 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2flj5"
Apr 21 15:35:06.356122 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.356105 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sp8fc\""
Apr 21 15:35:06.356982 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.356954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.357066 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.356989 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-socket-dir-parent\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.357066 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357010 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-cni-multus\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.357066 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357024 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lkj85"
Apr 21 15:35:06.357066 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357044 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.357066 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovnkube-script-lib\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.357291 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357082 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7bb4a1e-4105-43ec-a600-43495885c030-cni-binary-copy\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.357291 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-os-release\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.357291 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.357291 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357212 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h"
Apr 21 15:35:06.357291 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357255 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-socket-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h"
Apr 21 15:35:06.357291 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-registration-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357298 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c31902c-772d-4d70-a92a-d6c21f8a1a17-host-slash\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357314 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggmm6\" (UniqueName: \"kubernetes.io/projected/7c31902c-772d-4d70-a92a-d6c21f8a1a17-kube-api-access-ggmm6\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357345 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-slash\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-var-lib-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-etc-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n724m\" (UniqueName: \"kubernetes.io/projected/a7bb4a1e-4105-43ec-a600-43495885c030-kube-api-access-n724m\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357414 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-systemd\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357427 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a7bb4a1e-4105-43ec-a600-43495885c030-multus-daemon-config\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-kubelet\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357503 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-hostroot\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-env-overrides\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.357590 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-netns\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357609 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-conf-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-etc-kubernetes\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovnkube-config\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357681 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovn-node-metrics-cert\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxtw6\" (UniqueName: \"kubernetes.io/projected/39fccf23-7816-40f1-9d1a-0711aca322c8-kube-api-access-rxtw6\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6gbm\" (UniqueName: \"kubernetes.io/projected/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-kube-api-access-z6gbm\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-kubelet\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357788 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hnb7\" (UniqueName: \"kubernetes.io/projected/31c04054-fa66-445a-9246-9c32b20cd60d-kube-api-access-8hnb7\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-node-log\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-cni-netd\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xnkv\" (UniqueName: \"kubernetes.io/projected/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-kube-api-access-4xnkv\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-os-release\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357936 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-cni-binary-copy\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7c31902c-772d-4d70-a92a-d6c21f8a1a17-iptables-alerter-script\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56"
Apr 21 15:35:06.358060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.357997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/63dd9652-ce6c-4395-ae74-cba66c5a8c72-agent-certs\") pod \"konnectivity-agent-xph98\" (UID: \"63dd9652-ce6c-4395-ae74-cba66c5a8c72\") " pod="kube-system/konnectivity-agent-xph98"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-system-cni-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-cni-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-system-cni-dir\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358101 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/63dd9652-ce6c-4395-ae74-cba66c5a8c72-konnectivity-ca\") pod \"konnectivity-agent-xph98\" (UID: \"63dd9652-ce6c-4395-ae74-cba66c5a8c72\") " pod="kube-system/konnectivity-agent-xph98"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-cnibin\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358156 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-ovn\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358173 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-log-socket\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358187 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/39fccf23-7816-40f1-9d1a-0711aca322c8-serviceca\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358229 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-sys-fs\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-cnibin\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-etc-selinux\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-systemd-units\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358291 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-cni-bin\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358304 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39fccf23-7816-40f1-9d1a-0711aca322c8-host\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-multus-certs\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz2sw\" (UniqueName: \"kubernetes.io/projected/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-kube-api-access-jz2sw\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:06.358541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358368 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-run-netns\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.359065 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:35:06.359065 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-cni-bin\") pod \"multus-rvv9j\" (UID: 
\"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.359065 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-device-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.359065 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-run-ovn-kubernetes\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.359065 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358468 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-k8s-cni-cncf-io\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.359065 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.358667 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:06.359065 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.358710 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1" Apr 21 15:35:06.361853 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.361829 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 15:35:06.361940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.361857 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 15:35:06.361940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.361899 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 15:35:06.362056 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.361954 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:35:06.362056 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.362002 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 15:35:06.362150 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.362132 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mtn8s\"" Apr 21 15:35:06.362150 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.362143 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 15:35:06.362403 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.362387 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4fz95\"" Apr 21 15:35:06.362639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.362625 2573 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 15:35:06.371351 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.371333 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal"] Apr 21 15:35:06.372317 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.372298 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:06.372405 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.372385 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal" Apr 21 15:35:06.375160 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.375141 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:35:06.380839 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.380824 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:35:06.380984 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.380969 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal"] Apr 21 15:35:06.424976 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.424945 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a041855c13fefe5756acf5eec467e79.slice/crio-049d321842388ea8f8a2d6b98881cb53480ac93c90b59fb50b6f31cd4d952fdf WatchSource:0}: Error finding container 049d321842388ea8f8a2d6b98881cb53480ac93c90b59fb50b6f31cd4d952fdf: Status 404 
returned error can't find the container with id 049d321842388ea8f8a2d6b98881cb53480ac93c90b59fb50b6f31cd4d952fdf Apr 21 15:35:06.425156 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.425142 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode27d8eec64ea5a1e06e35be5839e2c48.slice/crio-552de36deba0dd5e48892b2c5f4568c3006c4f767fd2a99e4c00d372b5f785eb WatchSource:0}: Error finding container 552de36deba0dd5e48892b2c5f4568c3006c4f767fd2a99e4c00d372b5f785eb: Status 404 returned error can't find the container with id 552de36deba0dd5e48892b2c5f4568c3006c4f767fd2a99e4c00d372b5f785eb Apr 21 15:35:06.429119 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.429055 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:30:05 +0000 UTC" deadline="2027-09-26 23:24:01.27605923 +0000 UTC" Apr 21 15:35:06.429119 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.429082 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12559h48m54.84698082s" Apr 21 15:35:06.429633 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.429616 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:35:06.453691 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.453673 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 15:35:06.457325 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.457307 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4ft7t" Apr 21 15:35:06.459489 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovnkube-config\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.459548 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovn-node-metrics-cert\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.459548 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxtw6\" (UniqueName: \"kubernetes.io/projected/39fccf23-7816-40f1-9d1a-0711aca322c8-kube-api-access-rxtw6\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg" Apr 21 15:35:06.459548 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459536 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:06.459646 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysctl-conf\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.459646 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459569 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6gbm\" (UniqueName: \"kubernetes.io/projected/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-kube-api-access-z6gbm\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.459646 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-kubelet\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.459646 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.459646 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hnb7\" (UniqueName: \"kubernetes.io/projected/31c04054-fa66-445a-9246-9c32b20cd60d-kube-api-access-8hnb7\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.459873 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459671 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96e29eb1-d270-4d82-a139-d970d1863b1c-hosts-file\") pod \"node-resolver-lkj85\" (UID: 
\"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.459873 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-node-log\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.459873 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-cni-netd\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.459873 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459787 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-node-log\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.459873 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xnkv\" (UniqueName: \"kubernetes.io/projected/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-kube-api-access-4xnkv\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.459873 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-cni-netd\") pod \"ovnkube-node-9v44z\" (UID: 
\"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.459873 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-kubelet\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459890 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-os-release\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459872 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.459986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-cni-binary-copy\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-run\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7c31902c-772d-4d70-a92a-d6c21f8a1a17-iptables-alerter-script\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460077 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/63dd9652-ce6c-4395-ae74-cba66c5a8c72-agent-certs\") pod \"konnectivity-agent-xph98\" (UID: \"63dd9652-ce6c-4395-ae74-cba66c5a8c72\") " pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460109 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-system-cni-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-cni-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-system-cni-dir\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.460199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-kubernetes\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovnkube-config\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" 
(UniqueName: \"kubernetes.io/configmap/63dd9652-ce6c-4395-ae74-cba66c5a8c72-konnectivity-ca\") pod \"konnectivity-agent-xph98\" (UID: \"63dd9652-ce6c-4395-ae74-cba66c5a8c72\") " pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-cnibin\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460271 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf6kg\" (UniqueName: \"kubernetes.io/projected/96e29eb1-d270-4d82-a139-d970d1863b1c-kube-api-access-jf6kg\") pod \"node-resolver-lkj85\" (UID: \"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-system-cni-dir\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460305 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-ovn\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460310 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-system-cni-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-log-socket\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-cni-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-cnibin\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460401 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/39fccf23-7816-40f1-9d1a-0711aca322c8-serviceca\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-ovn\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-log-socket\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460486 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-cni-binary-copy\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7c31902c-772d-4d70-a92a-d6c21f8a1a17-iptables-alerter-script\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56" Apr 21 15:35:06.460649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460556 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-os-release\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysctl-d\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-var-lib-kubelet\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460650 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-sys-fs\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460674 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-cnibin\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460699 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96e29eb1-d270-4d82-a139-d970d1863b1c-tmp-dir\") pod \"node-resolver-lkj85\" (UID: \"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-lib-modules\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-etc-selinux\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-cnibin\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-sys-fs\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460787 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-systemd-units\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460831 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-cni-bin\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460851 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39fccf23-7816-40f1-9d1a-0711aca322c8-host\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/39fccf23-7816-40f1-9d1a-0711aca322c8-serviceca\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-multus-certs\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460878 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-etc-selinux\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/63dd9652-ce6c-4395-ae74-cba66c5a8c72-konnectivity-ca\") pod \"konnectivity-agent-xph98\" (UID: \"63dd9652-ce6c-4395-ae74-cba66c5a8c72\") " pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39fccf23-7816-40f1-9d1a-0711aca322c8-host\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg" Apr 21 15:35:06.461385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-cni-bin\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz2sw\" (UniqueName: \"kubernetes.io/projected/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-kube-api-access-jz2sw\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460918 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-multus-certs\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a45717a3-b37a-442d-822c-c0485d21bf6b-tmp\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-systemd-units\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.460984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-run-netns\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461077 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-run-netns\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-cni-bin\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-modprobe-d\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461147 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-cni-bin\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462178 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:35:06.461174 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-tuned\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-device-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-device-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461236 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-run-ovn-kubernetes\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-run-ovn-kubernetes\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-k8s-cni-cncf-io\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461298 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461322 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-socket-dir-parent\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-k8s-cni-cncf-io\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-cni-multus\") pod 
\"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461394 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-socket-dir-parent\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovnkube-script-lib\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7bb4a1e-4105-43ec-a600-43495885c030-cni-binary-copy\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461439 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-cni-multus\") pod 
\"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-os-release\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461559 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-socket-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-registration-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.462639 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-os-release\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461610 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c31902c-772d-4d70-a92a-d6c21f8a1a17-host-slash\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461637 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggmm6\" (UniqueName: \"kubernetes.io/projected/7c31902c-772d-4d70-a92a-d6c21f8a1a17-kube-api-access-ggmm6\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-slash\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-var-lib-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-socket-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-etc-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.463408 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n724m\" (UniqueName: \"kubernetes.io/projected/a7bb4a1e-4105-43ec-a600-43495885c030-kube-api-access-n724m\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461776 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31c04054-fa66-445a-9246-9c32b20cd60d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-systemd\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-host-slash\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-etc-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-var-lib-openvswitch\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-registration-dir\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7c31902c-772d-4d70-a92a-d6c21f8a1a17-host-slash\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.461990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-sys\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a7bb4a1e-4105-43ec-a600-43495885c030-cni-binary-copy\") pod \"multus-rvv9j\" (UID: 
\"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.463408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-systemd\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovnkube-script-lib\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a7bb4a1e-4105-43ec-a600-43495885c030-multus-daemon-config\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462073 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysconfig\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462087 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-run-systemd\") pod \"ovnkube-node-9v44z\" (UID: 
\"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht65t\" (UniqueName: \"kubernetes.io/projected/a45717a3-b37a-442d-822c-c0485d21bf6b-kube-api-access-ht65t\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462118 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31c04054-fa66-445a-9246-9c32b20cd60d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: \"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-kubelet\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-hostroot\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-host\") pod 
\"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-env-overrides\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-var-lib-kubelet\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-hostroot\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462238 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-netns\") pod \"multus-rvv9j\" (UID: 
\"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.462290 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-conf-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-multus-conf-dir\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-etc-kubernetes\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464748 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.462421 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:06.962385997 +0000 UTC m=+2.092439625 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:06.464748 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-host-run-netns\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464748 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.462453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7bb4a1e-4105-43ec-a600-43495885c030-etc-kubernetes\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464748 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.463173 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a7bb4a1e-4105-43ec-a600-43495885c030-multus-daemon-config\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.464748 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.463292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-env-overrides\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.464748 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.463573 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-ovn-node-metrics-cert\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.464748 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.463688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/63dd9652-ce6c-4395-ae74-cba66c5a8c72-agent-certs\") pod \"konnectivity-agent-xph98\" (UID: \"63dd9652-ce6c-4395-ae74-cba66c5a8c72\") " pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:06.464748 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.464571 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4ft7t" Apr 21 15:35:06.469695 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.469675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxtw6\" (UniqueName: \"kubernetes.io/projected/39fccf23-7816-40f1-9d1a-0711aca322c8-kube-api-access-rxtw6\") pod \"node-ca-m9xpg\" (UID: \"39fccf23-7816-40f1-9d1a-0711aca322c8\") " pod="openshift-image-registry/node-ca-m9xpg" Apr 21 15:35:06.473635 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.473615 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6gbm\" (UniqueName: \"kubernetes.io/projected/6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b-kube-api-access-z6gbm\") pod \"aws-ebs-csi-driver-node-44f8h\" (UID: \"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.474941 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.474919 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hnb7\" (UniqueName: \"kubernetes.io/projected/31c04054-fa66-445a-9246-9c32b20cd60d-kube-api-access-8hnb7\") pod \"multus-additional-cni-plugins-gkvb2\" (UID: 
\"31c04054-fa66-445a-9246-9c32b20cd60d\") " pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.475245 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.475228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n724m\" (UniqueName: \"kubernetes.io/projected/a7bb4a1e-4105-43ec-a600-43495885c030-kube-api-access-n724m\") pod \"multus-rvv9j\" (UID: \"a7bb4a1e-4105-43ec-a600-43495885c030\") " pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.475486 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.475470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggmm6\" (UniqueName: \"kubernetes.io/projected/7c31902c-772d-4d70-a92a-d6c21f8a1a17-kube-api-access-ggmm6\") pod \"iptables-alerter-4qs56\" (UID: \"7c31902c-772d-4d70-a92a-d6c21f8a1a17\") " pod="openshift-network-operator/iptables-alerter-4qs56" Apr 21 15:35:06.476014 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.475996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xnkv\" (UniqueName: \"kubernetes.io/projected/a8821bf6-e244-4b55-bfcc-7d85dec39bc4-kube-api-access-4xnkv\") pod \"ovnkube-node-9v44z\" (UID: \"a8821bf6-e244-4b55-bfcc-7d85dec39bc4\") " pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.476139 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.476123 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz2sw\" (UniqueName: \"kubernetes.io/projected/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-kube-api-access-jz2sw\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:06.478956 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.478918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal" 
event={"ID":"9a041855c13fefe5756acf5eec467e79","Type":"ContainerStarted","Data":"049d321842388ea8f8a2d6b98881cb53480ac93c90b59fb50b6f31cd4d952fdf"} Apr 21 15:35:06.479826 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.479790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal" event={"ID":"e27d8eec64ea5a1e06e35be5839e2c48","Type":"ContainerStarted","Data":"552de36deba0dd5e48892b2c5f4568c3006c4f767fd2a99e4c00d372b5f785eb"} Apr 21 15:35:06.533905 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.533876 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:06.563159 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-systemd\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563159 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-sys\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysconfig\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563190 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ht65t\" (UniqueName: \"kubernetes.io/projected/a45717a3-b37a-442d-822c-c0485d21bf6b-kube-api-access-ht65t\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-host\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563243 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563258 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-sys\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysctl-conf\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563276 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysconfig\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563266 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-systemd\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563310 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-host\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96e29eb1-d270-4d82-a139-d970d1863b1c-hosts-file\") pod \"node-resolver-lkj85\" (UID: \"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/96e29eb1-d270-4d82-a139-d970d1863b1c-hosts-file\") pod \"node-resolver-lkj85\" (UID: \"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.563375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-run\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-kubernetes\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-run\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563407 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf6kg\" (UniqueName: \"kubernetes.io/projected/96e29eb1-d270-4d82-a139-d970d1863b1c-kube-api-access-jf6kg\") pod \"node-resolver-lkj85\" (UID: \"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysctl-d\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-var-lib-kubelet\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96e29eb1-d270-4d82-a139-d970d1863b1c-tmp-dir\") pod \"node-resolver-lkj85\" (UID: \"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563483 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-kubernetes\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563493 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-lib-modules\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-var-lib-kubelet\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysctl-conf\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563615 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-lib-modules\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a45717a3-b37a-442d-822c-c0485d21bf6b-tmp\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-modprobe-d\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-tuned\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-sysctl-d\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-modprobe-d\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.563951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.563849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/96e29eb1-d270-4d82-a139-d970d1863b1c-tmp-dir\") pod \"node-resolver-lkj85\" (UID: \"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.565614 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.565599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a45717a3-b37a-442d-822c-c0485d21bf6b-tmp\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.565676 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.565661 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a45717a3-b37a-442d-822c-c0485d21bf6b-etc-tuned\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.570376 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.570350 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 
15:35:06.570376 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.570371 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:06.570492 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.570383 2573 projected.go:194] Error preparing data for projected volume kube-api-access-t5phk for pod openshift-network-diagnostics/network-check-target-ntgnx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:06.570492 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.570440 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk podName:8ea4d113-155e-4fa2-b765-c12d26b37fa1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:07.070426406 +0000 UTC m=+2.200480021 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-t5phk" (UniqueName: "kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk") pod "network-check-target-ntgnx" (UID: "8ea4d113-155e-4fa2-b765-c12d26b37fa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:35:06.571981 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.571963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf6kg\" (UniqueName: \"kubernetes.io/projected/96e29eb1-d270-4d82-a139-d970d1863b1c-kube-api-access-jf6kg\") pod \"node-resolver-lkj85\" (UID: \"96e29eb1-d270-4d82-a139-d970d1863b1c\") " pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.572054 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.571991 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht65t\" (UniqueName: \"kubernetes.io/projected/a45717a3-b37a-442d-822c-c0485d21bf6b-kube-api-access-ht65t\") pod \"tuned-2flj5\" (UID: \"a45717a3-b37a-442d-822c-c0485d21bf6b\") " pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.669556 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.669515 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4qs56" Apr 21 15:35:06.674388 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.674362 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:06.677754 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.677727 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c31902c_772d_4d70_a92a_d6c21f8a1a17.slice/crio-53ec22cbf0385ec7893e95058589a77680e9ecb782db128e14edbaf4bdfde789 WatchSource:0}: Error finding container 53ec22cbf0385ec7893e95058589a77680e9ecb782db128e14edbaf4bdfde789: Status 404 returned error can't find the container with id 53ec22cbf0385ec7893e95058589a77680e9ecb782db128e14edbaf4bdfde789 Apr 21 15:35:06.680859 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.680831 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8821bf6_e244_4b55_bfcc_7d85dec39bc4.slice/crio-9e80c5171bf3ad0def4724f17fbf2dd32569fb19fc4cc769eafd319f70707631 WatchSource:0}: Error finding container 9e80c5171bf3ad0def4724f17fbf2dd32569fb19fc4cc769eafd319f70707631: Status 404 returned error can't find the container with id 9e80c5171bf3ad0def4724f17fbf2dd32569fb19fc4cc769eafd319f70707631 Apr 21 15:35:06.695981 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.695961 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:06.701542 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.701517 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63dd9652_ce6c_4395_ae74_cba66c5a8c72.slice/crio-1a9fa72eb76334df9cc123378681dcba40130516befbc3c0bc8e3c260d903ebd WatchSource:0}: Error finding container 1a9fa72eb76334df9cc123378681dcba40130516befbc3c0bc8e3c260d903ebd: Status 404 returned error can't find the container with id 1a9fa72eb76334df9cc123378681dcba40130516befbc3c0bc8e3c260d903ebd Apr 21 15:35:06.706311 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.706295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m9xpg" Apr 21 15:35:06.711753 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.711732 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39fccf23_7816_40f1_9d1a_0711aca322c8.slice/crio-095668a641c0b15c5b3cd9b056b1f991270228b47f8e45d75415709a910fda18 WatchSource:0}: Error finding container 095668a641c0b15c5b3cd9b056b1f991270228b47f8e45d75415709a910fda18: Status 404 returned error can't find the container with id 095668a641c0b15c5b3cd9b056b1f991270228b47f8e45d75415709a910fda18 Apr 21 15:35:06.731428 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.731400 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" Apr 21 15:35:06.737499 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.737479 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c04054_fa66_445a_9246_9c32b20cd60d.slice/crio-4b06f34333fd727dadfef1d668f63fdb089cfa2a7013d65d3664cd85b8e02a9e WatchSource:0}: Error finding container 4b06f34333fd727dadfef1d668f63fdb089cfa2a7013d65d3664cd85b8e02a9e: Status 404 returned error can't find the container with id 4b06f34333fd727dadfef1d668f63fdb089cfa2a7013d65d3664cd85b8e02a9e Apr 21 15:35:06.738197 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.738173 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rvv9j" Apr 21 15:35:06.744481 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.744458 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" Apr 21 15:35:06.744682 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.744661 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7bb4a1e_4105_43ec_a600_43495885c030.slice/crio-50bbe1d688a6f7470d3e050972ad38b2f8dc85e9d55c3126860fcce5d88c7879 WatchSource:0}: Error finding container 50bbe1d688a6f7470d3e050972ad38b2f8dc85e9d55c3126860fcce5d88c7879: Status 404 returned error can't find the container with id 50bbe1d688a6f7470d3e050972ad38b2f8dc85e9d55c3126860fcce5d88c7879 Apr 21 15:35:06.751362 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.751195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2flj5" Apr 21 15:35:06.756229 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.755702 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lkj85" Apr 21 15:35:06.758925 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.758903 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda45717a3_b37a_442d_822c_c0485d21bf6b.slice/crio-4f6a99a525e1240fa4bf9830c920566fbf731e23e480387192335a2533c3956d WatchSource:0}: Error finding container 4f6a99a525e1240fa4bf9830c920566fbf731e23e480387192335a2533c3956d: Status 404 returned error can't find the container with id 4f6a99a525e1240fa4bf9830c920566fbf731e23e480387192335a2533c3956d Apr 21 15:35:06.763290 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:06.763270 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96e29eb1_d270_4d82_a139_d970d1863b1c.slice/crio-f8aefeaf835a172a21252209932a0094a96b46ef335994e6c0a92b3c160ae334 WatchSource:0}: Error finding container f8aefeaf835a172a21252209932a0094a96b46ef335994e6c0a92b3c160ae334: Status 404 returned error can't find the container with id f8aefeaf835a172a21252209932a0094a96b46ef335994e6c0a92b3c160ae334 Apr 21 15:35:06.966209 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:06.966125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:06.966365 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.966281 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:06.966365 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:06.966343 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:07.966326272 +0000 UTC m=+3.096379884 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:35:07.156868 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.156652 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:35:07.168700 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.168093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:07.168700 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:07.168279 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:35:07.168700 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:07.168299 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:35:07.168700 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:07.168312 2573 projected.go:194] Error preparing data for projected volume kube-api-access-t5phk for pod openshift-network-diagnostics/network-check-target-ntgnx: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:07.168700 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:07.168366 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk podName:8ea4d113-155e-4fa2-b765-c12d26b37fa1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:08.168348503 +0000 UTC m=+3.298402120 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-t5phk" (UniqueName: "kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk") pod "network-check-target-ntgnx" (UID: "8ea4d113-155e-4fa2-b765-c12d26b37fa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:07.465445 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.465399 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:06 +0000 UTC" deadline="2027-11-03 20:31:59.657910391 +0000 UTC"
Apr 21 15:35:07.465445 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.465443 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13468h56m52.19247233s"
Apr 21 15:35:07.477620 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.477555 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:07.477753 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:07.477670 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:07.511456 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.511415 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lkj85" event={"ID":"96e29eb1-d270-4d82-a139-d970d1863b1c","Type":"ContainerStarted","Data":"f8aefeaf835a172a21252209932a0094a96b46ef335994e6c0a92b3c160ae334"}
Apr 21 15:35:07.523504 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.523453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" event={"ID":"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b","Type":"ContainerStarted","Data":"23a7cf2198f611187dc7eb6e6a3ec09340179b79f46e78c6a3a467accbe217cf"}
Apr 21 15:35:07.539879 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.539839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rvv9j" event={"ID":"a7bb4a1e-4105-43ec-a600-43495885c030","Type":"ContainerStarted","Data":"50bbe1d688a6f7470d3e050972ad38b2f8dc85e9d55c3126860fcce5d88c7879"}
Apr 21 15:35:07.543030 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.543001 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerStarted","Data":"4b06f34333fd727dadfef1d668f63fdb089cfa2a7013d65d3664cd85b8e02a9e"}
Apr 21 15:35:07.545364 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.545338 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4qs56" event={"ID":"7c31902c-772d-4d70-a92a-d6c21f8a1a17","Type":"ContainerStarted","Data":"53ec22cbf0385ec7893e95058589a77680e9ecb782db128e14edbaf4bdfde789"}
Apr 21 15:35:07.561752 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.561721 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2flj5" event={"ID":"a45717a3-b37a-442d-822c-c0485d21bf6b","Type":"ContainerStarted","Data":"4f6a99a525e1240fa4bf9830c920566fbf731e23e480387192335a2533c3956d"}
Apr 21 15:35:07.580313 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.580279 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m9xpg" event={"ID":"39fccf23-7816-40f1-9d1a-0711aca322c8","Type":"ContainerStarted","Data":"095668a641c0b15c5b3cd9b056b1f991270228b47f8e45d75415709a910fda18"}
Apr 21 15:35:07.588146 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.588111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xph98" event={"ID":"63dd9652-ce6c-4395-ae74-cba66c5a8c72","Type":"ContainerStarted","Data":"1a9fa72eb76334df9cc123378681dcba40130516befbc3c0bc8e3c260d903ebd"}
Apr 21 15:35:07.599846 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.599756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"9e80c5171bf3ad0def4724f17fbf2dd32569fb19fc4cc769eafd319f70707631"}
Apr 21 15:35:07.977504 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:07.977418 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:07.977654 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:07.977640 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:07.977721 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:07.977704 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:09.977685311 +0000 UTC m=+5.107738928 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:08.178968 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:08.178908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:08.179165 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:08.179118 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:08.179165 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:08.179137 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:08.179165 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:08.179150 2573 projected.go:194] Error preparing data for projected volume kube-api-access-t5phk for pod openshift-network-diagnostics/network-check-target-ntgnx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:08.179332 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:08.179211 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk podName:8ea4d113-155e-4fa2-b765-c12d26b37fa1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:10.17919217 +0000 UTC m=+5.309245810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-t5phk" (UniqueName: "kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk") pod "network-check-target-ntgnx" (UID: "8ea4d113-155e-4fa2-b765-c12d26b37fa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:08.466009 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:08.465916 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:30:06 +0000 UTC" deadline="2028-01-01 23:54:07.85520979 +0000 UTC"
Apr 21 15:35:08.466009 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:08.465955 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14888h18m59.389257536s"
Apr 21 15:35:08.476276 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:08.476245 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:08.476445 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:08.476373 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:08.552476 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:08.552443 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 15:35:09.477113 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:09.477079 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:09.477577 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:09.477209 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:09.996000 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:09.995963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:09.996178 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:09.996130 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:09.996236 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:09.996197 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:13.996178571 +0000 UTC m=+9.126232184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:10.198168 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:10.198130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:10.198366 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:10.198314 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:10.198366 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:10.198333 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:10.198366 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:10.198346 2573 projected.go:194] Error preparing data for projected volume kube-api-access-t5phk for pod openshift-network-diagnostics/network-check-target-ntgnx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:10.198579 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:10.198403 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk podName:8ea4d113-155e-4fa2-b765-c12d26b37fa1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:14.198386423 +0000 UTC m=+9.328440039 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-t5phk" (UniqueName: "kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk") pod "network-check-target-ntgnx" (UID: "8ea4d113-155e-4fa2-b765-c12d26b37fa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:10.476494 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:10.476409 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:10.476661 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:10.476545 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:11.480342 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:11.479906 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:11.480342 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:11.480264 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:12.477342 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:12.476838 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:12.477342 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:12.476967 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:13.476648 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:13.476549 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:13.477139 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:13.476679 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:14.029427 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:14.029374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:14.029575 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:14.029533 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:14.029637 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:14.029600 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:22.029580577 +0000 UTC m=+17.159634204 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:14.231333 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:14.231288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:14.231504 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:14.231471 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:14.231504 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:14.231493 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:14.231623 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:14.231505 2573 projected.go:194] Error preparing data for projected volume kube-api-access-t5phk for pod openshift-network-diagnostics/network-check-target-ntgnx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:14.231623 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:14.231572 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk podName:8ea4d113-155e-4fa2-b765-c12d26b37fa1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:22.231553058 +0000 UTC m=+17.361606672 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-t5phk" (UniqueName: "kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk") pod "network-check-target-ntgnx" (UID: "8ea4d113-155e-4fa2-b765-c12d26b37fa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:14.476913 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:14.476812 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:14.477357 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:14.476964 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:15.477457 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:15.477403 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:15.477943 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:15.477533 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:16.476569 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:16.476529 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:16.476735 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:16.476690 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:17.480253 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:17.480224 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:17.480689 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:17.480331 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:18.476828 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:18.476770 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:18.477024 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:18.476903 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:19.478906 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:19.478876 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:19.479316 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:19.478985 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:20.476474 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:20.476439 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:20.476671 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:20.476567 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:21.479972 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:21.479940 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:21.480446 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:21.480048 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:22.086315 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:22.086279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:22.086517 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:22.086440 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:22.086517 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:22.086512 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:38.08648988 +0000 UTC m=+33.216543493 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:22.287726 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:22.287689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:22.287897 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:22.287855 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:22.287897 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:22.287876 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:22.287897 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:22.287889 2573 projected.go:194] Error preparing data for projected volume kube-api-access-t5phk for pod openshift-network-diagnostics/network-check-target-ntgnx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:22.288012 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:22.287954 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk podName:8ea4d113-155e-4fa2-b765-c12d26b37fa1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:38.287934106 +0000 UTC m=+33.417987720 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-t5phk" (UniqueName: "kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk") pod "network-check-target-ntgnx" (UID: "8ea4d113-155e-4fa2-b765-c12d26b37fa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:22.476848 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:22.476755 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:22.477013 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:22.476913 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:23.480096 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:23.480069 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:23.480465 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:23.480171 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:24.476633 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:24.476607 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:24.476746 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:24.476716 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:35:25.275253 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.275086 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4pjd9"]
Apr 21 15:35:25.277993 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.277976 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.278069 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:25.278052 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4pjd9" podUID="97199e5c-4c05-4197-84c9-e95b525f3ae1"
Apr 21 15:35:25.409255 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.408986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.409399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.409294 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97199e5c-4c05-4197-84c9-e95b525f3ae1-kubelet-config\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.409399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.409349 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97199e5c-4c05-4197-84c9-e95b525f3ae1-dbus\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.481657 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.481630 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:25.481787 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:25.481719 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1"
Apr 21 15:35:25.509764 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.509730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.509764 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.509768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97199e5c-4c05-4197-84c9-e95b525f3ae1-kubelet-config\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.509941 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.509813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97199e5c-4c05-4197-84c9-e95b525f3ae1-dbus\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.509941 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.509877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97199e5c-4c05-4197-84c9-e95b525f3ae1-kubelet-config\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.509941 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:25.509894 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:25.509941 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.509940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97199e5c-4c05-4197-84c9-e95b525f3ae1-dbus\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:25.510060 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:25.509953 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret podName:97199e5c-4c05-4197-84c9-e95b525f3ae1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:26.009938404 +0000 UTC m=+21.139992020 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret") pod "global-pull-secret-syncer-4pjd9" (UID: "97199e5c-4c05-4197-84c9-e95b525f3ae1") : object "kube-system"/"original-pull-secret" not registered
Apr 21 15:35:25.631394 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.631363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rvv9j" event={"ID":"a7bb4a1e-4105-43ec-a600-43495885c030","Type":"ContainerStarted","Data":"51b66810d2efc81b7eaadcff8f0142c38de195ed749553176d571cf286070e55"}
Apr 21 15:35:25.632613 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.632591 2573 generic.go:358] "Generic (PLEG): container finished" podID="31c04054-fa66-445a-9246-9c32b20cd60d" containerID="77c725d26ef3c11ce14a524c1ddbdccb75b2e4e02eb788c42587cb46f9219a2c" exitCode=0
Apr 21 15:35:25.632710 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.632650 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerDied","Data":"77c725d26ef3c11ce14a524c1ddbdccb75b2e4e02eb788c42587cb46f9219a2c"}
Apr 21 15:35:25.633982 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.633857 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4qs56" event={"ID":"7c31902c-772d-4d70-a92a-d6c21f8a1a17","Type":"ContainerStarted","Data":"88961bde6215af3ff7eeef59c1022cf862323ac90b6d1538bfd72ffb4cfdad74"}
Apr 21 15:35:25.635069 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.635045 2573 generic.go:358] "Generic (PLEG): container finished" podID="9a041855c13fefe5756acf5eec467e79" containerID="5b97252b0a9e65a3c44121b6e73779d8558245d311b5522c8d12c72fc0b2b916" exitCode=0
Apr 21 15:35:25.635161 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.635117 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal" event={"ID":"9a041855c13fefe5756acf5eec467e79","Type":"ContainerDied","Data":"5b97252b0a9e65a3c44121b6e73779d8558245d311b5522c8d12c72fc0b2b916"}
Apr 21 15:35:25.636395 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.636377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal" event={"ID":"e27d8eec64ea5a1e06e35be5839e2c48","Type":"ContainerStarted","Data":"4647fab72a0205a854d4b0ec5e4ecbe03339be22f30557adef29a82604ec34b8"}
Apr 21 15:35:25.637717 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.637689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2flj5" event={"ID":"a45717a3-b37a-442d-822c-c0485d21bf6b","Type":"ContainerStarted","Data":"6d1879caebdc5d135e2d096500bb58f8a66df8f6b2d4b6b90e65c6eaa7e259dd"}
Apr 21 15:35:25.638978 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.638954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-image-registry/node-ca-m9xpg" event={"ID":"39fccf23-7816-40f1-9d1a-0711aca322c8","Type":"ContainerStarted","Data":"6c4516c25400bc7a6558bd6b5f57843071ee75ebac183e1b22fab0495a6d1821"} Apr 21 15:35:25.640169 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.640139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xph98" event={"ID":"63dd9652-ce6c-4395-ae74-cba66c5a8c72","Type":"ContainerStarted","Data":"bae6792f80d7e769973852209cf31a256915c1628d8ffa269e811a1bc6bd6f72"} Apr 21 15:35:25.642233 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.642216 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 15:35:25.642524 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.642507 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8821bf6-e244-4b55-bfcc-7d85dec39bc4" containerID="3d68e7050a770a0f75118afc6a2c640995fcd88a1114eb476f711a235bec650f" exitCode=1 Apr 21 15:35:25.642594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.642562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"af160179ce281c3e0a5307fc9cfb374cdfec455ce78dfeae06a26f7719a909d7"} Apr 21 15:35:25.642594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.642584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"0d7161901724c82e105ff1c79f59ca25c4a8d99b774122569d18880e694c43c1"} Apr 21 15:35:25.642683 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.642593 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" 
event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"50d39af2842e2c0845c8bb613146bf1fd9345782d18e59135f871dd1abccc36d"} Apr 21 15:35:25.642683 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.642606 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"088f87e76a2ddce5b71b9e54aa4bc1c8216015ede52bdae0505deeb3d7e249b6"} Apr 21 15:35:25.642683 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.642615 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerDied","Data":"3d68e7050a770a0f75118afc6a2c640995fcd88a1114eb476f711a235bec650f"} Apr 21 15:35:25.642683 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.642625 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"a623da552eba58c0388e57b4f262770a100693db351344aaca1f37640ade060f"} Apr 21 15:35:25.643638 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.643610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lkj85" event={"ID":"96e29eb1-d270-4d82-a139-d970d1863b1c","Type":"ContainerStarted","Data":"47acd9e295dbbe6ecbe75686a7b67d1594ef43b9beeba9c1d973163336c3703b"} Apr 21 15:35:25.644681 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.644663 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" event={"ID":"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b","Type":"ContainerStarted","Data":"1418b6c2a6b69f38b82161a470b7be0dd5306fa1354170123b71546cc4e6a796"} Apr 21 15:35:25.649500 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.649463 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-rvv9j" podStartSLOduration=2.889687803 podStartE2EDuration="20.64945293s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:35:06.747208303 +0000 UTC m=+1.877261914" lastFinishedPulling="2026-04-21 15:35:24.506973421 +0000 UTC m=+19.637027041" observedRunningTime="2026-04-21 15:35:25.649285237 +0000 UTC m=+20.779338887" watchObservedRunningTime="2026-04-21 15:35:25.64945293 +0000 UTC m=+20.779506563" Apr 21 15:35:25.665788 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.665743 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4qs56" podStartSLOduration=2.839269557 podStartE2EDuration="20.665732325s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:35:06.679709364 +0000 UTC m=+1.809762975" lastFinishedPulling="2026-04-21 15:35:24.506172111 +0000 UTC m=+19.636225743" observedRunningTime="2026-04-21 15:35:25.665476047 +0000 UTC m=+20.795529681" watchObservedRunningTime="2026-04-21 15:35:25.665732325 +0000 UTC m=+20.795785984" Apr 21 15:35:25.677810 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.677754 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xph98" podStartSLOduration=2.984524395 podStartE2EDuration="20.677744267s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:35:06.702899281 +0000 UTC m=+1.832952892" lastFinishedPulling="2026-04-21 15:35:24.396119131 +0000 UTC m=+19.526172764" observedRunningTime="2026-04-21 15:35:25.677720641 +0000 UTC m=+20.807774269" watchObservedRunningTime="2026-04-21 15:35:25.677744267 +0000 UTC m=+20.807797900" Apr 21 15:35:25.717186 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.717138 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2flj5" 
podStartSLOduration=2.969982035 podStartE2EDuration="20.717123902s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:35:06.760552698 +0000 UTC m=+1.890606313" lastFinishedPulling="2026-04-21 15:35:24.507694561 +0000 UTC m=+19.637748180" observedRunningTime="2026-04-21 15:35:25.717063062 +0000 UTC m=+20.847116696" watchObservedRunningTime="2026-04-21 15:35:25.717123902 +0000 UTC m=+20.847177535" Apr 21 15:35:25.730283 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.730238 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-123.ec2.internal" podStartSLOduration=19.730222341 podStartE2EDuration="19.730222341s" podCreationTimestamp="2026-04-21 15:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:25.729830988 +0000 UTC m=+20.859884623" watchObservedRunningTime="2026-04-21 15:35:25.730222341 +0000 UTC m=+20.860275976" Apr 21 15:35:25.757624 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.757574 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-m9xpg" podStartSLOduration=7.48982162 podStartE2EDuration="20.757560767s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:35:06.713265588 +0000 UTC m=+1.843319200" lastFinishedPulling="2026-04-21 15:35:19.981004731 +0000 UTC m=+15.111058347" observedRunningTime="2026-04-21 15:35:25.757311055 +0000 UTC m=+20.887364689" watchObservedRunningTime="2026-04-21 15:35:25.757560767 +0000 UTC m=+20.887614398" Apr 21 15:35:25.779218 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:25.779117 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lkj85" podStartSLOduration=7.562947925 podStartE2EDuration="20.779099245s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" 
firstStartedPulling="2026-04-21 15:35:06.764849998 +0000 UTC m=+1.894903623" lastFinishedPulling="2026-04-21 15:35:19.981001319 +0000 UTC m=+15.111054943" observedRunningTime="2026-04-21 15:35:25.778917588 +0000 UTC m=+20.908971225" watchObservedRunningTime="2026-04-21 15:35:25.779099245 +0000 UTC m=+20.909152882" Apr 21 15:35:26.014253 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:26.014199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:26.014505 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:26.014353 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:26.014505 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:26.014423 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret podName:97199e5c-4c05-4197-84c9-e95b525f3ae1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:27.014403587 +0000 UTC m=+22.144457217 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret") pod "global-pull-secret-syncer-4pjd9" (UID: "97199e5c-4c05-4197-84c9-e95b525f3ae1") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:26.231307 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:26.231256 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:35:26.421031 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:26.420814 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:35:26.231276593Z","UUID":"a2cf87f3-12bc-4fd3-87d9-f9a252b60392","Handler":null,"Name":"","Endpoint":""} Apr 21 15:35:26.422597 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:26.422571 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:35:26.422597 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:26.422604 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:35:26.476989 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:26.476953 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:26.477234 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:26.476969 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:26.477234 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:26.477092 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233" Apr 21 15:35:26.477234 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:26.477148 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4pjd9" podUID="97199e5c-4c05-4197-84c9-e95b525f3ae1" Apr 21 15:35:26.650473 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:26.650428 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" event={"ID":"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b","Type":"ContainerStarted","Data":"3d9cccdf1e0523b4758c65e9c8bb6918a2e70688db1fcf55a761353411280537"} Apr 21 15:35:27.022228 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:27.022123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:27.022402 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:27.022292 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object 
"kube-system"/"original-pull-secret" not registered Apr 21 15:35:27.022402 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:27.022372 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret podName:97199e5c-4c05-4197-84c9-e95b525f3ae1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:29.022350509 +0000 UTC m=+24.152404122 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret") pod "global-pull-secret-syncer-4pjd9" (UID: "97199e5c-4c05-4197-84c9-e95b525f3ae1") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:27.479709 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:27.479676 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:27.480169 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:27.479812 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1" Apr 21 15:35:27.654459 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:27.654364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal" event={"ID":"9a041855c13fefe5756acf5eec467e79","Type":"ContainerStarted","Data":"5cb058a0f170e6944150084a4079b851013f32debcc4b8b071c6a65c392a5277"} Apr 21 15:35:27.657479 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:27.657456 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 15:35:27.657859 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:27.657824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"f946623b751db05352a0ec2f4210cdfab3076ceb5df316cb13edf7e4eefe1919"} Apr 21 15:35:27.659855 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:27.659830 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" event={"ID":"6a9ce3f2-ec7b-4278-88b3-b6eba783ca7b","Type":"ContainerStarted","Data":"ba2d6a8f7ef16f1e53ff2d493d2410d298c1b0a443c77bcf5b63ac31cfb6442e"} Apr 21 15:35:27.685766 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:27.685710 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-123.ec2.internal" podStartSLOduration=21.685691773 podStartE2EDuration="21.685691773s" podCreationTimestamp="2026-04-21 15:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:35:27.684972768 +0000 UTC m=+22.815026403" 
watchObservedRunningTime="2026-04-21 15:35:27.685691773 +0000 UTC m=+22.815745408" Apr 21 15:35:27.705280 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:27.705216 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-44f8h" podStartSLOduration=2.246108336 podStartE2EDuration="22.705196897s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:35:06.752660178 +0000 UTC m=+1.882713798" lastFinishedPulling="2026-04-21 15:35:27.211748735 +0000 UTC m=+22.341802359" observedRunningTime="2026-04-21 15:35:27.704599145 +0000 UTC m=+22.834652780" watchObservedRunningTime="2026-04-21 15:35:27.705196897 +0000 UTC m=+22.835250558" Apr 21 15:35:28.477253 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:28.477057 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:28.477437 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:28.477074 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:28.477437 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:28.477333 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4pjd9" podUID="97199e5c-4c05-4197-84c9-e95b525f3ae1" Apr 21 15:35:28.477437 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:28.477421 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233" Apr 21 15:35:29.036869 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:29.036813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:29.037253 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:29.036982 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:29.037253 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:29.037064 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret podName:97199e5c-4c05-4197-84c9-e95b525f3ae1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:33.037043427 +0000 UTC m=+28.167097043 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret") pod "global-pull-secret-syncer-4pjd9" (UID: "97199e5c-4c05-4197-84c9-e95b525f3ae1") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:29.477075 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:29.477039 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:29.477256 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:29.477185 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1" Apr 21 15:35:30.006459 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.006423 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:30.007163 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.007143 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:30.477116 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.476931 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:30.477820 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.476953 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:30.477820 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:30.477192 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4pjd9" podUID="97199e5c-4c05-4197-84c9-e95b525f3ae1" Apr 21 15:35:30.477820 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:30.477276 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233" Apr 21 15:35:30.667756 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.667725 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 15:35:30.668089 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.668061 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"5085de52b0a88794a16dbffe2c59932326aab338cb8d736258eac557240b8e5e"} Apr 21 15:35:30.668345 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.668325 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:30.668609 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.668584 2573 scope.go:117] "RemoveContainer" containerID="3d68e7050a770a0f75118afc6a2c640995fcd88a1114eb476f711a235bec650f" Apr 21 15:35:30.669727 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.669704 2573 generic.go:358] "Generic (PLEG): container finished" podID="31c04054-fa66-445a-9246-9c32b20cd60d" containerID="68314b785337cd3b14f4d6480a236662727de5764189bfe094ff8d66411bc45d" exitCode=0 Apr 21 15:35:30.669843 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.669734 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerDied","Data":"68314b785337cd3b14f4d6480a236662727de5764189bfe094ff8d66411bc45d"} Apr 21 15:35:30.670057 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.670037 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:30.670607 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.670583 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xph98" Apr 21 15:35:30.683734 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:30.683717 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:31.479621 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.479584 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:31.479991 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:31.479674 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1" Apr 21 15:35:31.675189 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.675169 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 15:35:31.675585 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.675554 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" event={"ID":"a8821bf6-e244-4b55-bfcc-7d85dec39bc4","Type":"ContainerStarted","Data":"80db79616ad3a3a21fbe95fc35af44b7a5b6fe92e6fa732c7a92ed9b84fcf41b"} Apr 21 15:35:31.675702 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.675684 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 15:35:31.676375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.676344 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:31.678155 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.678133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerStarted","Data":"7066a82e0a5eb7820e61ae5979dd5d6f42cc237169d187fc83c901ed5921b6a5"} Apr 21 15:35:31.691079 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.691050 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:31.708496 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.708441 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" podStartSLOduration=8.702641734 podStartE2EDuration="26.708424814s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:35:06.682555693 
+0000 UTC m=+1.812609305" lastFinishedPulling="2026-04-21 15:35:24.68833877 +0000 UTC m=+19.818392385" observedRunningTime="2026-04-21 15:35:31.705295571 +0000 UTC m=+26.835349204" watchObservedRunningTime="2026-04-21 15:35:31.708424814 +0000 UTC m=+26.838478450" Apr 21 15:35:31.960009 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.959971 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4pjd9"] Apr 21 15:35:31.960155 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.960141 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:31.960274 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:31.960250 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4pjd9" podUID="97199e5c-4c05-4197-84c9-e95b525f3ae1" Apr 21 15:35:31.963454 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.963429 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-28b7m"] Apr 21 15:35:31.963565 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.963557 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:31.963691 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:31.963670 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233" Apr 21 15:35:31.967215 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.967190 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ntgnx"] Apr 21 15:35:31.967309 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:31.967268 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:31.967352 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:31.967330 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1" Apr 21 15:35:32.681123 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:32.681087 2573 generic.go:358] "Generic (PLEG): container finished" podID="31c04054-fa66-445a-9246-9c32b20cd60d" containerID="7066a82e0a5eb7820e61ae5979dd5d6f42cc237169d187fc83c901ed5921b6a5" exitCode=0 Apr 21 15:35:32.681527 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:32.681176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerDied","Data":"7066a82e0a5eb7820e61ae5979dd5d6f42cc237169d187fc83c901ed5921b6a5"} Apr 21 15:35:32.681527 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:32.681249 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 15:35:32.803514 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:32.803465 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" Apr 21 15:35:33.070495 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:33.070453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:33.070656 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:33.070571 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:33.070656 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:33.070624 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret podName:97199e5c-4c05-4197-84c9-e95b525f3ae1 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:41.070611029 +0000 UTC m=+36.200664641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret") pod "global-pull-secret-syncer-4pjd9" (UID: "97199e5c-4c05-4197-84c9-e95b525f3ae1") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:35:33.480446 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:33.480420 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:33.480446 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:33.480421 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:33.480630 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:33.480470 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:33.480630 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:33.480543 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4pjd9" podUID="97199e5c-4c05-4197-84c9-e95b525f3ae1" Apr 21 15:35:33.480630 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:33.480609 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233" Apr 21 15:35:33.480723 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:33.480642 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1" Apr 21 15:35:33.686206 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:33.686106 2573 generic.go:358] "Generic (PLEG): container finished" podID="31c04054-fa66-445a-9246-9c32b20cd60d" containerID="69c532750984f84bbb6126bd1e85486e145189fa4c352803d48683f440d8611c" exitCode=0 Apr 21 15:35:33.686206 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:33.686194 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerDied","Data":"69c532750984f84bbb6126bd1e85486e145189fa4c352803d48683f440d8611c"} Apr 21 15:35:34.700189 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:34.700141 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z" podUID="a8821bf6-e244-4b55-bfcc-7d85dec39bc4" containerName="ovnkube-controller" probeResult="failure" output="" Apr 21 15:35:35.480923 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:35.480862 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:35.481106 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:35.480952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:35.481106 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:35.480980 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:35.481106 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:35.481017 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1" Apr 21 15:35:35.481232 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:35.481061 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233" Apr 21 15:35:35.481280 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:35.481264 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4pjd9" podUID="97199e5c-4c05-4197-84c9-e95b525f3ae1" Apr 21 15:35:37.477025 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.476755 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx" Apr 21 15:35:37.477464 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.476763 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:37.477464 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.476763 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:35:37.477464 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:37.477203 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntgnx" podUID="8ea4d113-155e-4fa2-b765-c12d26b37fa1" Apr 21 15:35:37.477464 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:37.477336 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4pjd9" podUID="97199e5c-4c05-4197-84c9-e95b525f3ae1" Apr 21 15:35:37.477464 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:37.477450 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233" Apr 21 15:35:37.717376 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.717332 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-123.ec2.internal" event="NodeReady" Apr 21 15:35:37.717554 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.717484 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 15:35:37.753632 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.753602 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk"] Apr 21 15:35:37.782438 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.782403 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"] Apr 21 15:35:37.782611 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.782580 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk" Apr 21 15:35:37.785751 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.785342 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 15:35:37.785751 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.785396 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 15:35:37.785751 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.785492 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-r86gr\"" Apr 21 15:35:37.785751 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.785588 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 15:35:37.785751 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.785651 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 15:35:37.796895 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.796865 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"] Apr 21 15:35:37.797061 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.797041 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" Apr 21 15:35:37.800123 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.800097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 15:35:37.813787 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.813752 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-789c9c8b96-xcctw"] Apr 21 15:35:37.813949 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.813925 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" Apr 21 15:35:37.818701 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.818680 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 15:35:37.818995 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.818972 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 15:35:37.819095 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.819044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 15:35:37.819154 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.819122 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 15:35:37.835164 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.835139 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk"] 
Apr 21 15:35:37.835278 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.835258 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"] Apr 21 15:35:37.835326 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.835288 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p6s7m"] Apr 21 15:35:37.835370 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.835260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:37.838304 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.838283 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 15:35:37.838419 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.838312 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 15:35:37.838419 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.838291 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kwwxq\"" Apr 21 15:35:37.838687 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.838665 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 15:35:37.850760 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.850740 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"] Apr 21 15:35:37.850875 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.850817 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-789c9c8b96-xcctw"] Apr 21 15:35:37.850875 ip-10-0-136-123 kubenswrapper[2573]: I0421 
15:35:37.850852 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zswf8"] Apr 21 15:35:37.850981 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.850946 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6s7m" Apr 21 15:35:37.851318 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.851300 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 15:35:37.855978 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.855952 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 15:35:37.856460 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.856443 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 15:35:37.856548 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.856451 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ld2z\"" Apr 21 15:35:37.863073 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.863052 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p6s7m"] Apr 21 15:35:37.863170 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.863085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zswf8"] Apr 21 15:35:37.863228 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.863212 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zswf8" Apr 21 15:35:37.868317 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.868284 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 15:35:37.868751 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.868727 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 15:35:37.869633 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.869125 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 15:35:37.869633 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.869168 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkwb5\"" Apr 21 15:35:37.905852 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.905822 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpv6t\" (UniqueName: \"kubernetes.io/projected/e86d6919-a690-43d0-bce1-125ffa4e89a0-kube-api-access-hpv6t\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" Apr 21 15:35:37.905852 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.905863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" Apr 21 15:35:37.906093 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.905892 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0916d2fa-b831-41db-8365-4d6cf0182f90-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" Apr 21 15:35:37.906093 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.905919 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-trusted-ca\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:37.906093 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.905991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-ca-trust-extracted\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:37.906093 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-bound-sa-token\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:37.906093 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-hub\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" Apr 21 15:35:37.906093 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-image-registry-private-configuration\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:37.906360 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f01c48ea-2a23-4c74-b151-71433151d77e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fd595c69c-6flrk\" (UID: \"f01c48ea-2a23-4c74-b151-71433151d77e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk" Apr 21 15:35:37.906360 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-certificates\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:37.906360 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h4w7\" (UniqueName: 
\"kubernetes.io/projected/0916d2fa-b831-41db-8365-4d6cf0182f90-kube-api-access-6h4w7\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" Apr 21 15:35:37.906360 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906270 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" Apr 21 15:35:37.906526 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8875\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-kube-api-access-b8875\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:37.906526 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906407 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e86d6919-a690-43d0-bce1-125ffa4e89a0-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" Apr 21 15:35:37.906526 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wbst\" (UniqueName: 
\"kubernetes.io/projected/f01c48ea-2a23-4c74-b151-71433151d77e-kube-api-access-9wbst\") pod \"managed-serviceaccount-addon-agent-fd595c69c-6flrk\" (UID: \"f01c48ea-2a23-4c74-b151-71433151d77e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk" Apr 21 15:35:37.906526 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906446 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-installation-pull-secrets\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:37.906526 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906464 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e86d6919-a690-43d0-bce1-125ffa4e89a0-tmp\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" Apr 21 15:35:37.906719 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906561 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-ca\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" Apr 21 15:35:37.906719 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:37.906591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" 
(UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.007475 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e86d6919-a690-43d0-bce1-125ffa4e89a0-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:35:38.007475 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wbst\" (UniqueName: \"kubernetes.io/projected/f01c48ea-2a23-4c74-b151-71433151d77e-kube-api-access-9wbst\") pod \"managed-serviceaccount-addon-agent-fd595c69c-6flrk\" (UID: \"f01c48ea-2a23-4c74-b151-71433151d77e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk"
Apr 21 15:35:38.007735 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-installation-pull-secrets\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.007735 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007569 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e86d6919-a690-43d0-bce1-125ffa4e89a0-tmp\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:35:38.007735 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-ca\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.007735 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.007735 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5tgv\" (UniqueName: \"kubernetes.io/projected/ecc2bf4d-8668-46f7-a489-514b0b505d8c-kube-api-access-b5tgv\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:35:38.007735 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/965e7720-2b43-4a79-9af6-74b4a24a9047-config-volume\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.007735 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpv6t\" (UniqueName: \"kubernetes.io/projected/e86d6919-a690-43d0-bce1-125ffa4e89a0-kube-api-access-hpv6t\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007758 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0916d2fa-b831-41db-8365-4d6cf0182f90-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-trusted-ca\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-ca-trust-extracted\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-bound-sa-token\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.007891 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.007909 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-789c9c8b96-xcctw: secret "image-registry-tls" not found
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7gl\" (UniqueName: \"kubernetes.io/projected/965e7720-2b43-4a79-9af6-74b4a24a9047-kube-api-access-hh7gl\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.007954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-hub\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.007964 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls podName:755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:38.507947915 +0000 UTC m=+33.638001526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls") pod "image-registry-789c9c8b96-xcctw" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e") : secret "image-registry-tls" not found
Apr 21 15:35:38.008086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.008067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e86d6919-a690-43d0-bce1-125ffa4e89a0-tmp\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:35:38.008657 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.008502 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-ca-trust-extracted\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.008704 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.008653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-image-registry-private-configuration\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.008704 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.008698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:35:38.008790 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.008736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f01c48ea-2a23-4c74-b151-71433151d77e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fd595c69c-6flrk\" (UID: \"f01c48ea-2a23-4c74-b151-71433151d77e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk"
Apr 21 15:35:38.008790 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.008768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-certificates\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.008909 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.008838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6h4w7\" (UniqueName: \"kubernetes.io/projected/0916d2fa-b831-41db-8365-4d6cf0182f90-kube-api-access-6h4w7\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.009232 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.009191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0916d2fa-b831-41db-8365-4d6cf0182f90-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.009665 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.009642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-trusted-ca\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.009759 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.009729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-certificates\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.009855 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.009732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.009916 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.009883 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.009972 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.009920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8875\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-kube-api-access-b8875\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.009972 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.009949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/965e7720-2b43-4a79-9af6-74b4a24a9047-tmp-dir\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.012948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.012895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-hub\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.012948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.012913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-installation-pull-secrets\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.013122 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.013022 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.013294 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.013269 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e86d6919-a690-43d0-bce1-125ffa4e89a0-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:35:38.013695 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.013670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.014073 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.014048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0916d2fa-b831-41db-8365-4d6cf0182f90-ca\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.014314 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.014293 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-image-registry-private-configuration\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.014376 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.014364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f01c48ea-2a23-4c74-b151-71433151d77e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-fd595c69c-6flrk\" (UID: \"f01c48ea-2a23-4c74-b151-71433151d77e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk"
Apr 21 15:35:38.020908 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.020888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-bound-sa-token\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.021018 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.020952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpv6t\" (UniqueName: \"kubernetes.io/projected/e86d6919-a690-43d0-bce1-125ffa4e89a0-kube-api-access-hpv6t\") pod \"klusterlet-addon-workmgr-5b8765b5cd-hk87t\" (UID: \"e86d6919-a690-43d0-bce1-125ffa4e89a0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:35:38.021280 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.021257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wbst\" (UniqueName: \"kubernetes.io/projected/f01c48ea-2a23-4c74-b151-71433151d77e-kube-api-access-9wbst\") pod \"managed-serviceaccount-addon-agent-fd595c69c-6flrk\" (UID: \"f01c48ea-2a23-4c74-b151-71433151d77e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk"
Apr 21 15:35:38.022159 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.022136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h4w7\" (UniqueName: \"kubernetes.io/projected/0916d2fa-b831-41db-8365-4d6cf0182f90-kube-api-access-6h4w7\") pod \"cluster-proxy-proxy-agent-5fc6686f8-g584l\" (UID: \"0916d2fa-b831-41db-8365-4d6cf0182f90\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.023424 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.023403 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8875\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-kube-api-access-b8875\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.101468 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.101425 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk"
Apr 21 15:35:38.110364 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.110329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5tgv\" (UniqueName: \"kubernetes.io/projected/ecc2bf4d-8668-46f7-a489-514b0b505d8c-kube-api-access-b5tgv\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:35:38.110505 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.110374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/965e7720-2b43-4a79-9af6-74b4a24a9047-config-volume\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.110572 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.110503 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:35:38.110572 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.110529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:38.110675 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.110576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7gl\" (UniqueName: \"kubernetes.io/projected/965e7720-2b43-4a79-9af6-74b4a24a9047-kube-api-access-hh7gl\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.110675 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.110619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:35:38.110675 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.110671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.110848 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.110686 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:38.110848 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.110697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/965e7720-2b43-4a79-9af6-74b4a24a9047-tmp-dir\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.110848 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.110777 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:10.110754602 +0000 UTC m=+65.240808228 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 15:35:38.110848 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.110841 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:38.111035 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.110901 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert podName:ecc2bf4d-8668-46f7-a489-514b0b505d8c nodeName:}" failed. No retries permitted until 2026-04-21 15:35:38.6108842 +0000 UTC m=+33.740937825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert") pod "ingress-canary-zswf8" (UID: "ecc2bf4d-8668-46f7-a489-514b0b505d8c") : secret "canary-serving-cert" not found
Apr 21 15:35:38.111035 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.110934 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:38.111035 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.110987 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls podName:965e7720-2b43-4a79-9af6-74b4a24a9047 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:38.610972236 +0000 UTC m=+33.741025852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls") pod "dns-default-p6s7m" (UID: "965e7720-2b43-4a79-9af6-74b4a24a9047") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:38.111035 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.111015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/965e7720-2b43-4a79-9af6-74b4a24a9047-tmp-dir\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.111224 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.111160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/965e7720-2b43-4a79-9af6-74b4a24a9047-config-volume\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.119412 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.119388 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7gl\" (UniqueName: \"kubernetes.io/projected/965e7720-2b43-4a79-9af6-74b4a24a9047-kube-api-access-hh7gl\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.119869 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.119848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5tgv\" (UniqueName: \"kubernetes.io/projected/ecc2bf4d-8668-46f7-a489-514b0b505d8c-kube-api-access-b5tgv\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:35:38.124138 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.124117 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:35:38.313185 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.313145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:38.313374 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.313336 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 15:35:38.313374 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.313362 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 15:35:38.313374 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.313375 2573 projected.go:194] Error preparing data for projected volume kube-api-access-t5phk for pod openshift-network-diagnostics/network-check-target-ntgnx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:38.313536 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.313442 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk podName:8ea4d113-155e-4fa2-b765-c12d26b37fa1 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:10.313423667 +0000 UTC m=+65.443477312 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-t5phk" (UniqueName: "kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk") pod "network-check-target-ntgnx" (UID: "8ea4d113-155e-4fa2-b765-c12d26b37fa1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 15:35:38.515023 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.514978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:38.515619 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.515237 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:35:38.515619 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.515252 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-789c9c8b96-xcctw: secret "image-registry-tls" not found
Apr 21 15:35:38.515619 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.515304 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls podName:755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:39.515290407 +0000 UTC m=+34.645344019 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls") pod "image-registry-789c9c8b96-xcctw" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e") : secret "image-registry-tls" not found
Apr 21 15:35:38.616584 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.616496 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:35:38.616584 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:38.616567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:38.616875 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.616663 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:35:38.616875 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.616696 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:35:38.616875 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.616734 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert podName:ecc2bf4d-8668-46f7-a489-514b0b505d8c nodeName:}" failed. No retries permitted until 2026-04-21 15:35:39.616716751 +0000 UTC m=+34.746770374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert") pod "ingress-canary-zswf8" (UID: "ecc2bf4d-8668-46f7-a489-514b0b505d8c") : secret "canary-serving-cert" not found
Apr 21 15:35:38.616875 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:38.616750 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls podName:965e7720-2b43-4a79-9af6-74b4a24a9047 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:39.616744687 +0000 UTC m=+34.746798298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls") pod "dns-default-p6s7m" (UID: "965e7720-2b43-4a79-9af6-74b4a24a9047") : secret "dns-default-metrics-tls" not found
Apr 21 15:35:39.272945 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.272915 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk"]
Apr 21 15:35:39.276935 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.276903 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"]
Apr 21 15:35:39.277933 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.277912 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"]
Apr 21 15:35:39.386321 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:39.386244 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01c48ea_2a23_4c74_b151_71433151d77e.slice/crio-f131d2dd7d1cab0d77bbf924e22a525b52bb0c686cde96f4bef6062c7e54022f WatchSource:0}: Error finding container f131d2dd7d1cab0d77bbf924e22a525b52bb0c686cde96f4bef6062c7e54022f: Status 404 returned error can't find the container with id f131d2dd7d1cab0d77bbf924e22a525b52bb0c686cde96f4bef6062c7e54022f
Apr 21 15:35:39.387271 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:39.387244 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0916d2fa_b831_41db_8365_4d6cf0182f90.slice/crio-a499a7c2ccebf23b5c16f2b87610f852ed6f6f93e4ec8b5faf0e336d8e1c0ce2 WatchSource:0}: Error finding container a499a7c2ccebf23b5c16f2b87610f852ed6f6f93e4ec8b5faf0e336d8e1c0ce2: Status 404 returned error can't find the container with id a499a7c2ccebf23b5c16f2b87610f852ed6f6f93e4ec8b5faf0e336d8e1c0ce2
Apr 21 15:35:39.387856 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:39.387831 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode86d6919_a690_43d0_bce1_125ffa4e89a0.slice/crio-89c283ee93883058ee508a60cd4213451e2c0660a42457b3a5081973698883c9 WatchSource:0}: Error finding container 89c283ee93883058ee508a60cd4213451e2c0660a42457b3a5081973698883c9: Status 404 returned error can't find the container with id 89c283ee93883058ee508a60cd4213451e2c0660a42457b3a5081973698883c9
Apr 21 15:35:39.476931 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.476902 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:35:39.477089 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.476902 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9"
Apr 21 15:35:39.477147 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.476911 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:35:39.480075 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.480054 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ndcm7\""
Apr 21 15:35:39.480744 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.480720 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 15:35:39.480931 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.480742 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 15:35:39.481221 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.481202 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 15:35:39.481316 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.481229 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 15:35:39.483269 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.483247 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjbkh\""
Apr 21 15:35:39.524614 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.524590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:35:39.525107 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:39.524702 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:35:39.525107 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:39.524714 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-789c9c8b96-xcctw: secret "image-registry-tls" not found
Apr 21 15:35:39.525107 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:39.524765 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls podName:755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:41.5247508 +0000 UTC m=+36.654804412 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls") pod "image-registry-789c9c8b96-xcctw" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e") : secret "image-registry-tls" not found
Apr 21 15:35:39.625895 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.625683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:35:39.626048 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.625930 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:35:39.626048
ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:39.625844 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:39.626048 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:39.626031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert podName:ecc2bf4d-8668-46f7-a489-514b0b505d8c nodeName:}" failed. No retries permitted until 2026-04-21 15:35:41.62600861 +0000 UTC m=+36.756062222 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert") pod "ingress-canary-zswf8" (UID: "ecc2bf4d-8668-46f7-a489-514b0b505d8c") : secret "canary-serving-cert" not found Apr 21 15:35:39.626209 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:39.626057 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:39.626209 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:39.626120 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls podName:965e7720-2b43-4a79-9af6-74b4a24a9047 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:41.626107612 +0000 UTC m=+36.756161223 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls") pod "dns-default-p6s7m" (UID: "965e7720-2b43-4a79-9af6-74b4a24a9047") : secret "dns-default-metrics-tls" not found Apr 21 15:35:39.701085 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.701050 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerStarted","Data":"9be030205098a498fff0c54e5660b78ef64897b52bbe8e4b70e02e3426f9bbd5"} Apr 21 15:35:39.704513 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.704472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" event={"ID":"e86d6919-a690-43d0-bce1-125ffa4e89a0","Type":"ContainerStarted","Data":"89c283ee93883058ee508a60cd4213451e2c0660a42457b3a5081973698883c9"} Apr 21 15:35:39.705396 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.705366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk" event={"ID":"f01c48ea-2a23-4c74-b151-71433151d77e","Type":"ContainerStarted","Data":"f131d2dd7d1cab0d77bbf924e22a525b52bb0c686cde96f4bef6062c7e54022f"} Apr 21 15:35:39.706461 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:39.706434 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" event={"ID":"0916d2fa-b831-41db-8365-4d6cf0182f90","Type":"ContainerStarted","Data":"a499a7c2ccebf23b5c16f2b87610f852ed6f6f93e4ec8b5faf0e336d8e1c0ce2"} Apr 21 15:35:40.717953 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:40.717910 2573 generic.go:358] "Generic (PLEG): container finished" podID="31c04054-fa66-445a-9246-9c32b20cd60d" containerID="9be030205098a498fff0c54e5660b78ef64897b52bbe8e4b70e02e3426f9bbd5" 
exitCode=0 Apr 21 15:35:40.718430 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:40.717963 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerDied","Data":"9be030205098a498fff0c54e5660b78ef64897b52bbe8e4b70e02e3426f9bbd5"} Apr 21 15:35:41.141900 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.141219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:41.149303 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.149240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97199e5c-4c05-4197-84c9-e95b525f3ae1-original-pull-secret\") pod \"global-pull-secret-syncer-4pjd9\" (UID: \"97199e5c-4c05-4197-84c9-e95b525f3ae1\") " pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:41.299632 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.299551 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4pjd9" Apr 21 15:35:41.502650 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.502592 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4pjd9"] Apr 21 15:35:41.547187 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.546987 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:41.547187 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:41.547185 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:41.547415 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:41.547202 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-789c9c8b96-xcctw: secret "image-registry-tls" not found Apr 21 15:35:41.547415 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:41.547273 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls podName:755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e nodeName:}" failed. No retries permitted until 2026-04-21 15:35:45.547246811 +0000 UTC m=+40.677300425 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls") pod "image-registry-789c9c8b96-xcctw" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e") : secret "image-registry-tls" not found Apr 21 15:35:41.648571 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.648499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8" Apr 21 15:35:41.648571 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.648567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m" Apr 21 15:35:41.648861 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:41.648641 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:41.648861 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:41.648724 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert podName:ecc2bf4d-8668-46f7-a489-514b0b505d8c nodeName:}" failed. No retries permitted until 2026-04-21 15:35:45.648701957 +0000 UTC m=+40.778755576 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert") pod "ingress-canary-zswf8" (UID: "ecc2bf4d-8668-46f7-a489-514b0b505d8c") : secret "canary-serving-cert" not found Apr 21 15:35:41.648861 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:41.648730 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:41.648861 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:41.648809 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls podName:965e7720-2b43-4a79-9af6-74b4a24a9047 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:45.648775099 +0000 UTC m=+40.778828718 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls") pod "dns-default-p6s7m" (UID: "965e7720-2b43-4a79-9af6-74b4a24a9047") : secret "dns-default-metrics-tls" not found Apr 21 15:35:41.725737 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.725698 2573 generic.go:358] "Generic (PLEG): container finished" podID="31c04054-fa66-445a-9246-9c32b20cd60d" containerID="3bad79f474b5fe50d580a9e22ae1ea4cf0ce8895481f93384e6b6c4c245bc8e9" exitCode=0 Apr 21 15:35:41.726250 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:41.725746 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerDied","Data":"3bad79f474b5fe50d580a9e22ae1ea4cf0ce8895481f93384e6b6c4c245bc8e9"} Apr 21 15:35:42.453335 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:35:42.453296 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97199e5c_4c05_4197_84c9_e95b525f3ae1.slice/crio-6d6e0066a36f9ce34ae9fbadb8f45054f2de60e9e452e8f5782c78e23f3d2f78 WatchSource:0}: Error finding container 6d6e0066a36f9ce34ae9fbadb8f45054f2de60e9e452e8f5782c78e23f3d2f78: Status 404 returned error can't find the container with id 6d6e0066a36f9ce34ae9fbadb8f45054f2de60e9e452e8f5782c78e23f3d2f78 Apr 21 15:35:42.729706 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:42.729444 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4pjd9" event={"ID":"97199e5c-4c05-4197-84c9-e95b525f3ae1","Type":"ContainerStarted","Data":"6d6e0066a36f9ce34ae9fbadb8f45054f2de60e9e452e8f5782c78e23f3d2f78"} Apr 21 15:35:45.581294 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:45.581258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:45.581699 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:45.581390 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:45.581699 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:45.581405 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-789c9c8b96-xcctw: secret "image-registry-tls" not found Apr 21 15:35:45.581699 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:45.581457 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls podName:755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e nodeName:}" failed. 
No retries permitted until 2026-04-21 15:35:53.58144353 +0000 UTC m=+48.711497142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls") pod "image-registry-789c9c8b96-xcctw" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e") : secret "image-registry-tls" not found Apr 21 15:35:45.682789 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:45.682352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8" Apr 21 15:35:45.682789 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:45.682408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m" Apr 21 15:35:45.682789 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:45.682669 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:45.682789 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:45.682680 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:45.682789 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:45.682735 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls podName:965e7720-2b43-4a79-9af6-74b4a24a9047 nodeName:}" failed. No retries permitted until 2026-04-21 15:35:53.682716178 +0000 UTC m=+48.812769789 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls") pod "dns-default-p6s7m" (UID: "965e7720-2b43-4a79-9af6-74b4a24a9047") : secret "dns-default-metrics-tls" not found Apr 21 15:35:45.682789 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:45.682753 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert podName:ecc2bf4d-8668-46f7-a489-514b0b505d8c nodeName:}" failed. No retries permitted until 2026-04-21 15:35:53.68274438 +0000 UTC m=+48.812797993 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert") pod "ingress-canary-zswf8" (UID: "ecc2bf4d-8668-46f7-a489-514b0b505d8c") : secret "canary-serving-cert" not found Apr 21 15:35:46.739195 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.739152 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" event={"ID":"e86d6919-a690-43d0-bce1-125ffa4e89a0","Type":"ContainerStarted","Data":"73f5964fa8a89ff5c8858947efe64f8b0a9efe92f2a6f20e87b3331c7a47bf5a"} Apr 21 15:35:46.739724 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.739595 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" Apr 21 15:35:46.740895 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.740870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk" event={"ID":"f01c48ea-2a23-4c74-b151-71433151d77e","Type":"ContainerStarted","Data":"602e078599e28f58a1ae76a6a47b9f6fc88896b1957e35b0e1a5df8947de8f14"} Apr 21 15:35:46.741416 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.741392 2573 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" Apr 21 15:35:46.742322 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.742288 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" event={"ID":"0916d2fa-b831-41db-8365-4d6cf0182f90","Type":"ContainerStarted","Data":"2880a7728c65fccbdb620cddcdee41d0d04a4bd92f63750828c49f738e81bfeb"} Apr 21 15:35:46.745447 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.745424 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" event={"ID":"31c04054-fa66-445a-9246-9c32b20cd60d","Type":"ContainerStarted","Data":"64e38951d3e7e1618de425a791dc2385c67f40c9a4df282a12f8a4b302af3bd5"} Apr 21 15:35:46.802000 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.801944 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" podStartSLOduration=24.555713904 podStartE2EDuration="30.80192437s" podCreationTimestamp="2026-04-21 15:35:16 +0000 UTC" firstStartedPulling="2026-04-21 15:35:39.409014358 +0000 UTC m=+34.539067980" lastFinishedPulling="2026-04-21 15:35:45.655224815 +0000 UTC m=+40.785278446" observedRunningTime="2026-04-21 15:35:46.764951981 +0000 UTC m=+41.895005609" watchObservedRunningTime="2026-04-21 15:35:46.80192437 +0000 UTC m=+41.931978006" Apr 21 15:35:46.802217 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.802049 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gkvb2" podStartSLOduration=9.108609755 podStartE2EDuration="41.802042072s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:35:06.739328922 +0000 UTC m=+1.869382534" lastFinishedPulling="2026-04-21 15:35:39.432761238 +0000 UTC 
m=+34.562814851" observedRunningTime="2026-04-21 15:35:46.801125155 +0000 UTC m=+41.931178789" watchObservedRunningTime="2026-04-21 15:35:46.802042072 +0000 UTC m=+41.932095706" Apr 21 15:35:46.852893 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:46.852834 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk" podStartSLOduration=24.630682046 podStartE2EDuration="30.852813882s" podCreationTimestamp="2026-04-21 15:35:16 +0000 UTC" firstStartedPulling="2026-04-21 15:35:39.409247871 +0000 UTC m=+34.539301489" lastFinishedPulling="2026-04-21 15:35:45.631379699 +0000 UTC m=+40.761433325" observedRunningTime="2026-04-21 15:35:46.852502439 +0000 UTC m=+41.982556086" watchObservedRunningTime="2026-04-21 15:35:46.852813882 +0000 UTC m=+41.982867523" Apr 21 15:35:48.753582 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:48.753541 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4pjd9" event={"ID":"97199e5c-4c05-4197-84c9-e95b525f3ae1","Type":"ContainerStarted","Data":"43cf5fa8e1d58d1530eb9d282a1c3122879405caca090c6af6394beef5865b5c"} Apr 21 15:35:48.773830 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:48.773002 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4pjd9" podStartSLOduration=18.683146404 podStartE2EDuration="23.77298283s" podCreationTimestamp="2026-04-21 15:35:25 +0000 UTC" firstStartedPulling="2026-04-21 15:35:42.455255805 +0000 UTC m=+37.585309422" lastFinishedPulling="2026-04-21 15:35:47.545092231 +0000 UTC m=+42.675145848" observedRunningTime="2026-04-21 15:35:48.772067502 +0000 UTC m=+43.902121136" watchObservedRunningTime="2026-04-21 15:35:48.77298283 +0000 UTC m=+43.903036460" Apr 21 15:35:49.758163 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:49.758127 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" event={"ID":"0916d2fa-b831-41db-8365-4d6cf0182f90","Type":"ContainerStarted","Data":"aab1108f3fd677c7c6884d1f44626b0dfb249ff149a41ad32ee3e26b10b0737b"} Apr 21 15:35:49.758163 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:49.758165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" event={"ID":"0916d2fa-b831-41db-8365-4d6cf0182f90","Type":"ContainerStarted","Data":"69167c6ce04c628d92a10a3373fd004385c76c3623a4c1e0913d620f625b05cb"} Apr 21 15:35:49.779039 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:49.778982 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" podStartSLOduration=24.003384975 podStartE2EDuration="33.778967525s" podCreationTimestamp="2026-04-21 15:35:16 +0000 UTC" firstStartedPulling="2026-04-21 15:35:39.409177276 +0000 UTC m=+34.539230898" lastFinishedPulling="2026-04-21 15:35:49.184759835 +0000 UTC m=+44.314813448" observedRunningTime="2026-04-21 15:35:49.777414127 +0000 UTC m=+44.907467760" watchObservedRunningTime="2026-04-21 15:35:49.778967525 +0000 UTC m=+44.909021159" Apr 21 15:35:53.651178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:53.651132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:35:53.651563 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:53.651261 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:35:53.651563 ip-10-0-136-123 kubenswrapper[2573]: E0421 
15:35:53.651274 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-789c9c8b96-xcctw: secret "image-registry-tls" not found Apr 21 15:35:53.651563 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:53.651332 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls podName:755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e nodeName:}" failed. No retries permitted until 2026-04-21 15:36:09.651313908 +0000 UTC m=+64.781367520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls") pod "image-registry-789c9c8b96-xcctw" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e") : secret "image-registry-tls" not found Apr 21 15:35:53.751987 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:53.751946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8" Apr 21 15:35:53.752137 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:35:53.751998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m" Apr 21 15:35:53.752137 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:53.752103 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:35:53.752137 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:53.752107 2573 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:35:53.752239 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:53.752161 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls podName:965e7720-2b43-4a79-9af6-74b4a24a9047 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:09.752142813 +0000 UTC m=+64.882196429 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls") pod "dns-default-p6s7m" (UID: "965e7720-2b43-4a79-9af6-74b4a24a9047") : secret "dns-default-metrics-tls" not found Apr 21 15:35:53.752239 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:35:53.752176 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert podName:ecc2bf4d-8668-46f7-a489-514b0b505d8c nodeName:}" failed. No retries permitted until 2026-04-21 15:36:09.75216951 +0000 UTC m=+64.882223122 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert") pod "ingress-canary-zswf8" (UID: "ecc2bf4d-8668-46f7-a489-514b0b505d8c") : secret "canary-serving-cert" not found
Apr 21 15:36:04.698264 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:04.698234 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9v44z"
Apr 21 15:36:09.671573 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:09.671530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:36:09.671997 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:09.671652 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:36:09.671997 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:09.671664 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-789c9c8b96-xcctw: secret "image-registry-tls" not found
Apr 21 15:36:09.671997 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:09.671716 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls podName:755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e nodeName:}" failed. No retries permitted until 2026-04-21 15:36:41.671701261 +0000 UTC m=+96.801754873 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls") pod "image-registry-789c9c8b96-xcctw" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e") : secret "image-registry-tls" not found
Apr 21 15:36:09.772683 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:09.772639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:36:09.772865 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:09.772696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:36:09.772865 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:09.772791 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:09.772940 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:09.772877 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert podName:ecc2bf4d-8668-46f7-a489-514b0b505d8c nodeName:}" failed. No retries permitted until 2026-04-21 15:36:41.772863406 +0000 UTC m=+96.902917018 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert") pod "ingress-canary-zswf8" (UID: "ecc2bf4d-8668-46f7-a489-514b0b505d8c") : secret "canary-serving-cert" not found
Apr 21 15:36:09.772940 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:09.772813 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:09.773019 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:09.772953 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls podName:965e7720-2b43-4a79-9af6-74b4a24a9047 nodeName:}" failed. No retries permitted until 2026-04-21 15:36:41.772937208 +0000 UTC m=+96.902990834 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls") pod "dns-default-p6s7m" (UID: "965e7720-2b43-4a79-9af6-74b4a24a9047") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:10.176173 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.176127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:36:10.181583 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.181555 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 15:36:10.187125 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:10.187105 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 15:36:10.187220 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:10.187166 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:14.187147638 +0000 UTC m=+129.317201251 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : secret "metrics-daemon-secret" not found
Apr 21 15:36:10.378127 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.378091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:36:10.381207 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.381179 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 15:36:10.391006 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.390975 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 15:36:10.401960 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.401933 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5phk\" (UniqueName: \"kubernetes.io/projected/8ea4d113-155e-4fa2-b765-c12d26b37fa1-kube-api-access-t5phk\") pod \"network-check-target-ntgnx\" (UID: \"8ea4d113-155e-4fa2-b765-c12d26b37fa1\") " pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:36:10.693374 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.693346 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ndcm7\""
Apr 21 15:36:10.700904 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.700880 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:36:10.829493 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:10.829459 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ntgnx"]
Apr 21 15:36:10.832515 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:36:10.832486 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea4d113_155e_4fa2_b765_c12d26b37fa1.slice/crio-8d24e702311769b4fef2902bb88b14e5afffce8166fe6d01cfdd928d48e91ef7 WatchSource:0}: Error finding container 8d24e702311769b4fef2902bb88b14e5afffce8166fe6d01cfdd928d48e91ef7: Status 404 returned error can't find the container with id 8d24e702311769b4fef2902bb88b14e5afffce8166fe6d01cfdd928d48e91ef7
Apr 21 15:36:11.814563 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:11.814518 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ntgnx" event={"ID":"8ea4d113-155e-4fa2-b765-c12d26b37fa1","Type":"ContainerStarted","Data":"8d24e702311769b4fef2902bb88b14e5afffce8166fe6d01cfdd928d48e91ef7"}
Apr 21 15:36:13.821001 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:13.820961 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ntgnx" event={"ID":"8ea4d113-155e-4fa2-b765-c12d26b37fa1","Type":"ContainerStarted","Data":"f06e646161425d5abfb3eabc422faf990a0ed1fc2743cdddde15b2c95b41bf28"}
Apr 21 15:36:13.821397 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:13.821090 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:36:13.846824 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:13.846759 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ntgnx" podStartSLOduration=66.25022392 podStartE2EDuration="1m8.84674452s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:36:10.846056311 +0000 UTC m=+65.976109923" lastFinishedPulling="2026-04-21 15:36:13.442576895 +0000 UTC m=+68.572630523" observedRunningTime="2026-04-21 15:36:13.845654868 +0000 UTC m=+68.975708501" watchObservedRunningTime="2026-04-21 15:36:13.84674452 +0000 UTC m=+68.976798154"
Apr 21 15:36:41.718046 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:41.717946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:36:41.718543 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:41.718111 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 15:36:41.718543 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:41.718135 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-789c9c8b96-xcctw: secret "image-registry-tls" not found
Apr 21 15:36:41.718543 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:41.718225 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls podName:755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e nodeName:}" failed. No retries permitted until 2026-04-21 15:37:45.718203437 +0000 UTC m=+160.848257065 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls") pod "image-registry-789c9c8b96-xcctw" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e") : secret "image-registry-tls" not found
Apr 21 15:36:41.818694 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:41.818663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:36:41.818860 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:41.818703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:36:41.818860 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:41.818830 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 15:36:41.818933 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:41.818902 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert podName:ecc2bf4d-8668-46f7-a489-514b0b505d8c nodeName:}" failed. No retries permitted until 2026-04-21 15:37:45.818883781 +0000 UTC m=+160.948937393 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert") pod "ingress-canary-zswf8" (UID: "ecc2bf4d-8668-46f7-a489-514b0b505d8c") : secret "canary-serving-cert" not found
Apr 21 15:36:41.818933 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:41.818834 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 15:36:41.819044 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:36:41.818989 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls podName:965e7720-2b43-4a79-9af6-74b4a24a9047 nodeName:}" failed. No retries permitted until 2026-04-21 15:37:45.818973941 +0000 UTC m=+160.949027558 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls") pod "dns-default-p6s7m" (UID: "965e7720-2b43-4a79-9af6-74b4a24a9047") : secret "dns-default-metrics-tls" not found
Apr 21 15:36:44.826292 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:36:44.826256 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ntgnx"
Apr 21 15:37:14.263300 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:14.263252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:37:14.263904 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:37:14.263415 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 15:37:14.263904 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:37:14.263517 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs podName:9c107ca7-f14c-4f8c-a8d4-4e08e3acb233 nodeName:}" failed. No retries permitted until 2026-04-21 15:39:16.263492657 +0000 UTC m=+251.393546269 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs") pod "network-metrics-daemon-28b7m" (UID: "9c107ca7-f14c-4f8c-a8d4-4e08e3acb233") : secret "metrics-daemon-secret" not found
Apr 21 15:37:26.413004 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:26.412975 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lkj85_96e29eb1-d270-4d82-a139-d970d1863b1c/dns-node-resolver/0.log"
Apr 21 15:37:27.414697 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:27.414665 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m9xpg_39fccf23-7816-40f1-9d1a-0711aca322c8/node-ca/0.log"
Apr 21 15:37:40.853121 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:37:40.853067 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" podUID="755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"
Apr 21 15:37:40.861231 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:37:40.861197 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p6s7m" podUID="965e7720-2b43-4a79-9af6-74b4a24a9047"
Apr 21 15:37:40.873453 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:37:40.873419 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zswf8" podUID="ecc2bf4d-8668-46f7-a489-514b0b505d8c"
Apr 21 15:37:41.019413 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:41.019385 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:37:41.019574 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:41.019385 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:37:42.506270 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:37:42.506218 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-28b7m" podUID="9c107ca7-f14c-4f8c-a8d4-4e08e3acb233"
Apr 21 15:37:45.574922 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.574884 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-htkxj"]
Apr 21 15:37:45.578013 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.577993 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.585438 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.585406 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 15:37:45.585567 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.585443 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 15:37:45.586749 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.586731 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-shndr\""
Apr 21 15:37:45.586950 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.586937 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 15:37:45.588372 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.588356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 15:37:45.601394 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.601369 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-htkxj"]
Apr 21 15:37:45.708716 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.708681 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdtsh\" (UniqueName: \"kubernetes.io/projected/97c1233c-3be7-4359-982f-fb2aaa9a7fea-kube-api-access-mdtsh\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.708894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.708737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97c1233c-3be7-4359-982f-fb2aaa9a7fea-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.708894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.708863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97c1233c-3be7-4359-982f-fb2aaa9a7fea-data-volume\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.708973 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.708927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97c1233c-3be7-4359-982f-fb2aaa9a7fea-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.708973 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.708957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97c1233c-3be7-4359-982f-fb2aaa9a7fea-crio-socket\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.809857 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.809826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97c1233c-3be7-4359-982f-fb2aaa9a7fea-data-volume\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.809953 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.809917 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97c1233c-3be7-4359-982f-fb2aaa9a7fea-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.809953 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.809937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97c1233c-3be7-4359-982f-fb2aaa9a7fea-crio-socket\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.810027 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.809967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtsh\" (UniqueName: \"kubernetes.io/projected/97c1233c-3be7-4359-982f-fb2aaa9a7fea-kube-api-access-mdtsh\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.810027 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.810000 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:37:45.810099 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.810027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97c1233c-3be7-4359-982f-fb2aaa9a7fea-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.810099 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.810059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97c1233c-3be7-4359-982f-fb2aaa9a7fea-crio-socket\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.810178 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.810160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97c1233c-3be7-4359-982f-fb2aaa9a7fea-data-volume\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.810546 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.810521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97c1233c-3be7-4359-982f-fb2aaa9a7fea-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.812080 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.812057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97c1233c-3be7-4359-982f-fb2aaa9a7fea-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.812243 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.812224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"image-registry-789c9c8b96-xcctw\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:37:45.824630 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.824610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kwwxq\""
Apr 21 15:37:45.830865 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.830818 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:37:45.839691 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.839664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdtsh\" (UniqueName: \"kubernetes.io/projected/97c1233c-3be7-4359-982f-fb2aaa9a7fea-kube-api-access-mdtsh\") pod \"insights-runtime-extractor-htkxj\" (UID: \"97c1233c-3be7-4359-982f-fb2aaa9a7fea\") " pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.887224 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.887195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-htkxj"
Apr 21 15:37:45.911446 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.911398 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:37:45.911588 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.911503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:37:45.913774 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.913751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/965e7720-2b43-4a79-9af6-74b4a24a9047-metrics-tls\") pod \"dns-default-p6s7m\" (UID: \"965e7720-2b43-4a79-9af6-74b4a24a9047\") " pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:37:45.914492 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.914471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecc2bf4d-8668-46f7-a489-514b0b505d8c-cert\") pod \"ingress-canary-zswf8\" (UID: \"ecc2bf4d-8668-46f7-a489-514b0b505d8c\") " pod="openshift-ingress-canary/ingress-canary-zswf8"
Apr 21 15:37:45.967246 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:45.967190 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-789c9c8b96-xcctw"]
Apr 21 15:37:45.971348 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:37:45.971317 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod755e24aa_0cb9_4572_b0bc_7fa61c3b2a7e.slice/crio-c89dfdd2b5dec26aa78dd972dde87a9c7276121e072a0822b14690364207a3b1 WatchSource:0}: Error finding container c89dfdd2b5dec26aa78dd972dde87a9c7276121e072a0822b14690364207a3b1: Status 404 returned error can't find the container with id c89dfdd2b5dec26aa78dd972dde87a9c7276121e072a0822b14690364207a3b1
Apr 21 15:37:46.018107 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.018075 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-htkxj"]
Apr 21 15:37:46.021679 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:37:46.021651 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c1233c_3be7_4359_982f_fb2aaa9a7fea.slice/crio-d14104624b176c71b4a75d04d7bcbfb26b6f76e3cbb7fcef6037a87f01efbef8 WatchSource:0}: Error finding container d14104624b176c71b4a75d04d7bcbfb26b6f76e3cbb7fcef6037a87f01efbef8: Status 404 returned error can't find the container with id d14104624b176c71b4a75d04d7bcbfb26b6f76e3cbb7fcef6037a87f01efbef8
Apr 21 15:37:46.032708 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.032677 2573 generic.go:358] "Generic (PLEG): container finished" podID="f01c48ea-2a23-4c74-b151-71433151d77e" containerID="602e078599e28f58a1ae76a6a47b9f6fc88896b1957e35b0e1a5df8947de8f14" exitCode=255
Apr 21 15:37:46.032895 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.032758 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk" event={"ID":"f01c48ea-2a23-4c74-b151-71433151d77e","Type":"ContainerDied","Data":"602e078599e28f58a1ae76a6a47b9f6fc88896b1957e35b0e1a5df8947de8f14"}
Apr 21 15:37:46.033157 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.033133 2573 scope.go:117] "RemoveContainer" containerID="602e078599e28f58a1ae76a6a47b9f6fc88896b1957e35b0e1a5df8947de8f14"
Apr 21 15:37:46.033996 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.033973 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-htkxj" event={"ID":"97c1233c-3be7-4359-982f-fb2aaa9a7fea","Type":"ContainerStarted","Data":"d14104624b176c71b4a75d04d7bcbfb26b6f76e3cbb7fcef6037a87f01efbef8"}
Apr 21 15:37:46.038233 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.038087 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" event={"ID":"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e","Type":"ContainerStarted","Data":"0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1"}
Apr 21 15:37:46.038233 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.038118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" event={"ID":"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e","Type":"ContainerStarted","Data":"c89dfdd2b5dec26aa78dd972dde87a9c7276121e072a0822b14690364207a3b1"}
Apr 21 15:37:46.038233 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.038194 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:37:46.039580 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.039557 2573 generic.go:358] "Generic (PLEG): container finished" podID="e86d6919-a690-43d0-bce1-125ffa4e89a0" containerID="73f5964fa8a89ff5c8858947efe64f8b0a9efe92f2a6f20e87b3331c7a47bf5a" exitCode=1
Apr 21 15:37:46.039690 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.039598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" event={"ID":"e86d6919-a690-43d0-bce1-125ffa4e89a0","Type":"ContainerDied","Data":"73f5964fa8a89ff5c8858947efe64f8b0a9efe92f2a6f20e87b3331c7a47bf5a"}
Apr 21 15:37:46.039911 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.039896 2573 scope.go:117] "RemoveContainer" containerID="73f5964fa8a89ff5c8858947efe64f8b0a9efe92f2a6f20e87b3331c7a47bf5a"
Apr 21 15:37:46.123386 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.123358 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ld2z\""
Apr 21 15:37:46.128634 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.128589 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" podStartSLOduration=160.128573013 podStartE2EDuration="2m40.128573013s" podCreationTimestamp="2026-04-21 15:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:37:46.125177148 +0000 UTC m=+161.255230783" watchObservedRunningTime="2026-04-21 15:37:46.128573013 +0000 UTC m=+161.258626646"
Apr 21 15:37:46.131126 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.131076 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:37:46.268555 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.268522 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p6s7m"]
Apr 21 15:37:46.271604 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:37:46.271560 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965e7720_2b43_4a79_9af6_74b4a24a9047.slice/crio-729f84420663bd1ae6171128e4b58f809384458fba1f706f27880f1a7a7147c6 WatchSource:0}: Error finding container 729f84420663bd1ae6171128e4b58f809384458fba1f706f27880f1a7a7147c6: Status 404 returned error can't find the container with id 729f84420663bd1ae6171128e4b58f809384458fba1f706f27880f1a7a7147c6
Apr 21 15:37:46.739444 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:46.739407 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:37:47.044122 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:47.044033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6s7m" event={"ID":"965e7720-2b43-4a79-9af6-74b4a24a9047","Type":"ContainerStarted","Data":"729f84420663bd1ae6171128e4b58f809384458fba1f706f27880f1a7a7147c6"}
Apr 21 15:37:47.045771 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:47.045742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-fd595c69c-6flrk" event={"ID":"f01c48ea-2a23-4c74-b151-71433151d77e","Type":"ContainerStarted","Data":"99743f697524c02c201094550dd8109f0b9244405ee3024de4d3e6fd73f3f131"}
Apr 21 15:37:47.047407 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:47.047382 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-htkxj" event={"ID":"97c1233c-3be7-4359-982f-fb2aaa9a7fea","Type":"ContainerStarted","Data":"0d7cf0964224ad60b2424fdbccc941159f846135f1a54e3d755e0c69ae45a594"}
Apr 21 15:37:47.047524 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:47.047413 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-htkxj" event={"ID":"97c1233c-3be7-4359-982f-fb2aaa9a7fea","Type":"ContainerStarted","Data":"113022a6d557986fbcaf56ce9deacc369ff8e8739387df36da6dc93619a522db"}
Apr 21 15:37:47.049036 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:47.048996 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t" event={"ID":"e86d6919-a690-43d0-bce1-125ffa4e89a0","Type":"ContainerStarted","Data":"6ea32e8e33b480e6d5181bb5d8343a0227b27c41bcdbe5d7a6b18ab28fffb3bc"}
Apr 21 15:37:48.053918 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:48.053824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6s7m" event={"ID":"965e7720-2b43-4a79-9af6-74b4a24a9047","Type":"ContainerStarted","Data":"a5709c2b984bbdf442e182ca9d553c0886e068e66a7eab45c660ee17bda276ca"}
Apr 21 15:37:48.054375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:48.054173 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:37:48.055026 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:48.055001 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b8765b5cd-hk87t"
Apr 21 15:37:49.058026 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:49.057988 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p6s7m" event={"ID":"965e7720-2b43-4a79-9af6-74b4a24a9047","Type":"ContainerStarted","Data":"f3755e00341deae0a57c093af0090025e24d9304fddd0f2e63c08241ff826cf4"}
Apr 21 15:37:49.058497 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:49.058123 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p6s7m"
Apr 21 15:37:49.059690 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:49.059664 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-htkxj" event={"ID":"97c1233c-3be7-4359-982f-fb2aaa9a7fea","Type":"ContainerStarted","Data":"7de0feda6cc768a6a0f6c44f6f7576f050189719950cf5789d9849a13266d05c"}
Apr 21 15:37:49.078941 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:49.078903 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-p6s7m" podStartSLOduration=130.679412273 podStartE2EDuration="2m12.078890182s" podCreationTimestamp="2026-04-21 15:35:37 +0000 UTC" firstStartedPulling="2026-04-21 15:37:46.2734648 +0000 UTC m=+161.403518412" lastFinishedPulling="2026-04-21 15:37:47.672942706 +0000 UTC m=+162.802996321" observedRunningTime="2026-04-21 15:37:49.077849221 +0000 UTC m=+164.207902852" watchObservedRunningTime="2026-04-21 15:37:49.078890182 +0000 UTC m=+164.208943816"
Apr 21 15:37:49.105134 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:49.105089 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-htkxj" podStartSLOduration=1.975480976 podStartE2EDuration="4.105074567s" podCreationTimestamp="2026-04-21 15:37:45 +0000 UTC" firstStartedPulling="2026-04-21 15:37:46.080965393 +0000 UTC m=+161.211019011" lastFinishedPulling="2026-04-21 15:37:48.21055898 +0000 UTC m=+163.340612602" observedRunningTime="2026-04-21 15:37:49.102960123 +0000 UTC m=+164.233013788" watchObservedRunningTime="2026-04-21 15:37:49.105074567 +0000 UTC m=+164.235128200"
Apr 21
15:37:53.476498 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:53.476456 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zswf8" Apr 21 15:37:53.479530 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:53.479504 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkwb5\"" Apr 21 15:37:53.486962 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:53.486931 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zswf8" Apr 21 15:37:53.632201 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:53.632169 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zswf8"] Apr 21 15:37:53.639698 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:37:53.639674 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecc2bf4d_8668_46f7_a489_514b0b505d8c.slice/crio-67801df611b8c2578ada05f70041802638ecef699b1144b9e0519b35e82868fb WatchSource:0}: Error finding container 67801df611b8c2578ada05f70041802638ecef699b1144b9e0519b35e82868fb: Status 404 returned error can't find the container with id 67801df611b8c2578ada05f70041802638ecef699b1144b9e0519b35e82868fb Apr 21 15:37:54.073919 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:54.073873 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zswf8" event={"ID":"ecc2bf4d-8668-46f7-a489-514b0b505d8c","Type":"ContainerStarted","Data":"67801df611b8c2578ada05f70041802638ecef699b1144b9e0519b35e82868fb"} Apr 21 15:37:54.476735 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:54.476646 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m" Apr 21 15:37:56.080693 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:56.080650 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zswf8" event={"ID":"ecc2bf4d-8668-46f7-a489-514b0b505d8c","Type":"ContainerStarted","Data":"a64ade1730a4679f4e13a1a8863c8a1778fcb7b77fb5273fbd4c46da19adf91c"} Apr 21 15:37:56.106876 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:56.106823 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zswf8" podStartSLOduration=137.570853346 podStartE2EDuration="2m19.106790068s" podCreationTimestamp="2026-04-21 15:35:37 +0000 UTC" firstStartedPulling="2026-04-21 15:37:53.64139224 +0000 UTC m=+168.771445852" lastFinishedPulling="2026-04-21 15:37:55.177328959 +0000 UTC m=+170.307382574" observedRunningTime="2026-04-21 15:37:56.105874312 +0000 UTC m=+171.235927946" watchObservedRunningTime="2026-04-21 15:37:56.106790068 +0000 UTC m=+171.236843701" Apr 21 15:37:59.067400 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:37:59.067368 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p6s7m" Apr 21 15:38:01.002689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.002651 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vmrmq"] Apr 21 15:38:01.005956 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.005939 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.008824 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.008780 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 15:38:01.008971 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.008829 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 15:38:01.009180 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.009160 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 15:38:01.009336 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.009318 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 15:38:01.010018 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.010001 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-52fql\"" Apr 21 15:38:01.010107 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.010051 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 15:38:01.010166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.010128 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 15:38:01.126369 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-metrics-client-ca\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " 
pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.126583 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126450 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-accelerators-collector-config\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.126583 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-sys\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.126583 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126555 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-textfile\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.126745 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.126745 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126648 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gldg7\" (UniqueName: \"kubernetes.io/projected/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-kube-api-access-gldg7\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.126745 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-wtmp\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.126907 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126757 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-root\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.126907 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.126786 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-tls\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227346 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-metrics-client-ca\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227499 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227372 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-accelerators-collector-config\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227499 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-sys\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227499 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-textfile\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227499 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227499 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gldg7\" (UniqueName: \"kubernetes.io/projected/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-kube-api-access-gldg7\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " 
pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227499 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-wtmp\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227775 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227505 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-root\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227775 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-sys\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227775 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227544 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-root\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227775 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-tls\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 
15:38:01.227775 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.227634 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-wtmp\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.227775 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:38:01.227668 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 15:38:01.227775 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:38:01.227719 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-tls podName:9a3f9599-c99a-4c6a-b295-b12b9a4fbc96 nodeName:}" failed. No retries permitted until 2026-04-21 15:38:01.727702511 +0000 UTC m=+176.857756127 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-tls") pod "node-exporter-vmrmq" (UID: "9a3f9599-c99a-4c6a-b295-b12b9a4fbc96") : secret "node-exporter-tls" not found Apr 21 15:38:01.228138 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.228095 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-metrics-client-ca\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.228138 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.228095 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.228243 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.228226 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-textfile\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.229990 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.229972 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.259238 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.259157 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldg7\" (UniqueName: \"kubernetes.io/projected/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-kube-api-access-gldg7\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.730863 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.730791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-tls\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.733214 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.733189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/9a3f9599-c99a-4c6a-b295-b12b9a4fbc96-node-exporter-tls\") pod \"node-exporter-vmrmq\" (UID: \"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96\") " pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.915335 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:01.915300 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vmrmq" Apr 21 15:38:01.923391 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:38:01.923363 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a3f9599_c99a_4c6a_b295_b12b9a4fbc96.slice/crio-563282a021a3eecf2208a33112151777cfcc035b40bb88c66c5e0a71f2a30d1a WatchSource:0}: Error finding container 563282a021a3eecf2208a33112151777cfcc035b40bb88c66c5e0a71f2a30d1a: Status 404 returned error can't find the container with id 563282a021a3eecf2208a33112151777cfcc035b40bb88c66c5e0a71f2a30d1a Apr 21 15:38:02.097778 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:02.097741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vmrmq" event={"ID":"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96","Type":"ContainerStarted","Data":"563282a021a3eecf2208a33112151777cfcc035b40bb88c66c5e0a71f2a30d1a"} Apr 21 15:38:03.101329 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:03.101296 2573 generic.go:358] "Generic (PLEG): container finished" podID="9a3f9599-c99a-4c6a-b295-b12b9a4fbc96" containerID="c2c5f6bdcecdd8c9873b1f342fda657a62963e3519578c0a47a24eb8556eda5c" exitCode=0 Apr 21 15:38:03.101709 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:03.101382 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vmrmq" event={"ID":"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96","Type":"ContainerDied","Data":"c2c5f6bdcecdd8c9873b1f342fda657a62963e3519578c0a47a24eb8556eda5c"} Apr 21 15:38:04.106086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:04.106049 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vmrmq" event={"ID":"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96","Type":"ContainerStarted","Data":"789b00c7b0c0997e04a643bb8c0f6e73588a39882a9f89d1c38b4cf4de2db68f"} Apr 21 15:38:04.106086 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:04.106085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vmrmq" event={"ID":"9a3f9599-c99a-4c6a-b295-b12b9a4fbc96","Type":"ContainerStarted","Data":"88eac6fc26a246bcd67968a50d05a56442690efdb7b1f584c6c141401378c8cc"} Apr 21 15:38:04.126716 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:04.126667 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vmrmq" podStartSLOduration=3.2534445339999998 podStartE2EDuration="4.126652274s" podCreationTimestamp="2026-04-21 15:38:00 +0000 UTC" firstStartedPulling="2026-04-21 15:38:01.925113707 +0000 UTC m=+177.055167334" lastFinishedPulling="2026-04-21 15:38:02.798321451 +0000 UTC m=+177.928375074" observedRunningTime="2026-04-21 15:38:04.125144067 +0000 UTC m=+179.255197701" watchObservedRunningTime="2026-04-21 15:38:04.126652274 +0000 UTC m=+179.256705953" Apr 21 15:38:05.835412 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:05.835363 2573 patch_prober.go:28] interesting pod/image-registry-789c9c8b96-xcctw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 15:38:05.835779 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:05.835422 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" podUID="755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Apr 21 15:38:07.053779 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:07.053750 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:38:07.588200 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:07.588165 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-789c9c8b96-xcctw"] Apr 21 15:38:28.126071 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:28.126030 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" podUID="0916d2fa-b831-41db-8365-4d6cf0182f90" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 15:38:32.607012 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.606972 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" podUID="755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" containerName="registry" containerID="cri-o://0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1" gracePeriod=30 Apr 21 15:38:32.838890 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.838867 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" Apr 21 15:38:32.978044 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.977951 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-image-registry-private-configuration\") pod \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " Apr 21 15:38:32.978044 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978001 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-certificates\") pod \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " Apr 21 15:38:32.978044 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978037 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-trusted-ca\") pod \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " Apr 21 15:38:32.978321 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978056 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-ca-trust-extracted\") pod \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") " Apr 21 15:38:32.978321 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978081 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8875\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-kube-api-access-b8875\") pod \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\" (UID: 
\"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") "
Apr 21 15:38:32.978321 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978103 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") pod \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") "
Apr 21 15:38:32.978321 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978124 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-bound-sa-token\") pod \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") "
Apr 21 15:38:32.978321 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978146 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-installation-pull-secrets\") pod \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\" (UID: \"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e\") "
Apr 21 15:38:32.978675 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978612 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:38:32.978675 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.978637 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:38:32.981635 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.981182 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:38:32.981836 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.981788 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-kube-api-access-b8875" (OuterVolumeSpecName: "kube-api-access-b8875") pod "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"). InnerVolumeSpecName "kube-api-access-b8875". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:38:32.982436 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.981956 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:38:32.982436 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.982016 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:38:32.985162 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.983899 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:38:32.991662 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:32.991634 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" (UID: "755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:38:33.079272 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.079228 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-trusted-ca\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:38:33.079272 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.079263 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-ca-trust-extracted\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:38:33.079272 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.079274 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b8875\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-kube-api-access-b8875\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:38:33.079490 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.079286 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-tls\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:38:33.079490 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.079296 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-bound-sa-token\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:38:33.079490 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.079304 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-installation-pull-secrets\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:38:33.079490 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.079314 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-image-registry-private-configuration\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:38:33.079490 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.079324 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e-registry-certificates\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:38:33.179993 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.179957 2573 generic.go:358] "Generic (PLEG): container finished" podID="755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" containerID="0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1" exitCode=0
Apr 21 15:38:33.180139 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.180014 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" event={"ID":"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e","Type":"ContainerDied","Data":"0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1"}
Apr 21 15:38:33.180139 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.180043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw" event={"ID":"755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e","Type":"ContainerDied","Data":"c89dfdd2b5dec26aa78dd972dde87a9c7276121e072a0822b14690364207a3b1"}
Apr 21 15:38:33.180139 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.180049 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-789c9c8b96-xcctw"
Apr 21 15:38:33.180239 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.180058 2573 scope.go:117] "RemoveContainer" containerID="0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1"
Apr 21 15:38:33.187869 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.187838 2573 scope.go:117] "RemoveContainer" containerID="0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1"
Apr 21 15:38:33.188165 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:38:33.188144 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1\": container with ID starting with 0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1 not found: ID does not exist" containerID="0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1"
Apr 21 15:38:33.188213 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.188175 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1"} err="failed to get container status \"0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1\": rpc error: code = NotFound desc = could not find container \"0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1\": container with ID starting with 0d1d98774f0330c2f838ce64e660c3e7f8b19a21dd68bafce48c22c4079e95a1 not found: ID does not exist"
Apr 21 15:38:33.216164 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.216130 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-789c9c8b96-xcctw"]
Apr 21 15:38:33.219812 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.219765 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-789c9c8b96-xcctw"]
Apr 21 15:38:33.480181 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:33.480148 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" path="/var/lib/kubelet/pods/755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e/volumes"
Apr 21 15:38:38.125848 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:38.125782 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" podUID="0916d2fa-b831-41db-8365-4d6cf0182f90" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 15:38:47.046297 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:47.046268 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vmrmq_9a3f9599-c99a-4c6a-b295-b12b9a4fbc96/init-textfile/0.log"
Apr 21 15:38:47.246940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:47.246911 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vmrmq_9a3f9599-c99a-4c6a-b295-b12b9a4fbc96/node-exporter/0.log"
Apr 21 15:38:47.445437 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:47.445409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vmrmq_9a3f9599-c99a-4c6a-b295-b12b9a4fbc96/kube-rbac-proxy/0.log"
Apr 21 15:38:48.125017 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:48.124978 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" podUID="0916d2fa-b831-41db-8365-4d6cf0182f90" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 21 15:38:48.125488 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:48.125070 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l"
Apr 21 15:38:48.125678 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:48.125653 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"aab1108f3fd677c7c6884d1f44626b0dfb249ff149a41ad32ee3e26b10b0737b"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 21 15:38:48.125747 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:48.125705 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" podUID="0916d2fa-b831-41db-8365-4d6cf0182f90" containerName="service-proxy" containerID="cri-o://aab1108f3fd677c7c6884d1f44626b0dfb249ff149a41ad32ee3e26b10b0737b" gracePeriod=30
Apr 21 15:38:49.226114 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:49.226071 2573 generic.go:358] "Generic (PLEG): container finished" podID="0916d2fa-b831-41db-8365-4d6cf0182f90" containerID="aab1108f3fd677c7c6884d1f44626b0dfb249ff149a41ad32ee3e26b10b0737b" exitCode=2
Apr 21 15:38:49.226584 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:49.226140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" event={"ID":"0916d2fa-b831-41db-8365-4d6cf0182f90","Type":"ContainerDied","Data":"aab1108f3fd677c7c6884d1f44626b0dfb249ff149a41ad32ee3e26b10b0737b"}
Apr 21 15:38:49.226584 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:38:49.226177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5fc6686f8-g584l" event={"ID":"0916d2fa-b831-41db-8365-4d6cf0182f90","Type":"ContainerStarted","Data":"7599309e0f40a9be8df2e60f78510518e3a632d9321fbbc2c5abd5df1fce8ac8"}
Apr 21 15:39:16.309105 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:16.309070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:39:16.311367 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:16.311343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c107ca7-f14c-4f8c-a8d4-4e08e3acb233-metrics-certs\") pod \"network-metrics-daemon-28b7m\" (UID: \"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233\") " pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:39:16.380046 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:16.380011 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjbkh\""
Apr 21 15:39:16.387596 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:16.387573 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-28b7m"
Apr 21 15:39:16.506249 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:16.506208 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-28b7m"]
Apr 21 15:39:16.509329 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:39:16.509299 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c107ca7_f14c_4f8c_a8d4_4e08e3acb233.slice/crio-a06f1c74c033502105588c86f87f50481210333890e4173cd728cd1c0bc2fcd7 WatchSource:0}: Error finding container a06f1c74c033502105588c86f87f50481210333890e4173cd728cd1c0bc2fcd7: Status 404 returned error can't find the container with id a06f1c74c033502105588c86f87f50481210333890e4173cd728cd1c0bc2fcd7
Apr 21 15:39:17.302047 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:17.302006 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-28b7m" event={"ID":"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233","Type":"ContainerStarted","Data":"a06f1c74c033502105588c86f87f50481210333890e4173cd728cd1c0bc2fcd7"}
Apr 21 15:39:18.306976 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:18.306933 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-28b7m" event={"ID":"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233","Type":"ContainerStarted","Data":"081d32ed8cd9f63ff6a25f313aa7bc24cce7587505e31ef682cda3b31db217f0"}
Apr 21 15:39:18.306976 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:18.306974 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-28b7m" event={"ID":"9c107ca7-f14c-4f8c-a8d4-4e08e3acb233","Type":"ContainerStarted","Data":"5cb23638bd90c341d0b25abc14d968475fcc80675b9028bc8288b1a98ea7e582"}
Apr 21 15:39:18.325643 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:39:18.325441 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-28b7m" podStartSLOduration=252.419862737 podStartE2EDuration="4m13.325422713s" podCreationTimestamp="2026-04-21 15:35:05 +0000 UTC" firstStartedPulling="2026-04-21 15:39:16.511629116 +0000 UTC m=+251.641682728" lastFinishedPulling="2026-04-21 15:39:17.41718909 +0000 UTC m=+252.547242704" observedRunningTime="2026-04-21 15:39:18.324467662 +0000 UTC m=+253.454521295" watchObservedRunningTime="2026-04-21 15:39:18.325422713 +0000 UTC m=+253.455476350"
Apr 21 15:40:05.373049 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:40:05.373003 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log"
Apr 21 15:40:05.373856 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:40:05.373836 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log"
Apr 21 15:40:05.377106 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:40:05.377081 2573 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 15:42:31.704294 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.704220 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-6vhnl"]
Apr 21 15:42:31.704717 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.704459 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" containerName="registry"
Apr 21 15:42:31.704717 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.704469 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" containerName="registry"
Apr 21 15:42:31.704717 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.704512 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="755e24aa-0cb9-4572-b0bc-7fa61c3b2a7e" containerName="registry"
Apr 21 15:42:31.707033 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.707018 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:31.710246 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.710222 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 21 15:42:31.710559 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.710540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 21 15:42:31.710676 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.710575 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 21 15:42:31.710747 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.710690 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 21 15:42:31.711467 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.711451 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 21 15:42:31.711987 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.711972 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-vrwtg\""
Apr 21 15:42:31.719586 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.719564 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-6vhnl"]
Apr 21 15:42:31.833594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.833559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5bb02280-688c-4ff5-9216-7f370bf82081-cabundle0\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:31.833594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.833602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgptr\" (UniqueName: \"kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-kube-api-access-dgptr\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:31.833823 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.833626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:31.934934 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.934888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5bb02280-688c-4ff5-9216-7f370bf82081-cabundle0\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:31.935053 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.934950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgptr\" (UniqueName: \"kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-kube-api-access-dgptr\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:31.935053 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.934982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:31.935134 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:31.935089 2573 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 21 15:42:31.935134 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:31.935108 2573 secret.go:281] references non-existent secret key: ca.crt
Apr 21 15:42:31.935134 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:31.935118 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 15:42:31.935134 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:31.935134 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-6vhnl: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 21 15:42:31.935272 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:31.935218 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates podName:5bb02280-688c-4ff5-9216-7f370bf82081 nodeName:}" failed. No retries permitted until 2026-04-21 15:42:32.435196635 +0000 UTC m=+447.565250250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates") pod "keda-operator-ffbb595cb-6vhnl" (UID: "5bb02280-688c-4ff5-9216-7f370bf82081") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 21 15:42:31.935662 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.935644 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/5bb02280-688c-4ff5-9216-7f370bf82081-cabundle0\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:31.947510 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:31.947473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgptr\" (UniqueName: \"kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-kube-api-access-dgptr\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:32.045412 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.045338 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"]
Apr 21 15:42:32.048496 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.048476 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.051642 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.051617 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 21 15:42:32.057055 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.057016 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"]
Apr 21 15:42:32.137048 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.137010 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.137048 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.137048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjs2\" (UniqueName: \"kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-kube-api-access-7mjs2\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.137267 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.137120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.238275 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.238224 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.238275 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.238280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjs2\" (UniqueName: \"kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-kube-api-access-7mjs2\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.238432 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.238303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.238432 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.238402 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 21 15:42:32.238432 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.238415 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 21 15:42:32.238432 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.238429 2573 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 21 15:42:32.238552 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.238444 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 21 15:42:32.238552 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.238507 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-certificates podName:15baaaf4-8abc-4db2-8dd7-d772eb3a90d1 nodeName:}" failed. No retries permitted until 2026-04-21 15:42:32.738489486 +0000 UTC m=+447.868543098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-certificates") pod "keda-metrics-apiserver-7c9f485588-c2gbv" (UID: "15baaaf4-8abc-4db2-8dd7-d772eb3a90d1") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 21 15:42:32.238622 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.238576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.251412 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.251387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjs2\" (UniqueName: \"kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-kube-api-access-7mjs2\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.440045 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.440011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:32.440212 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.440160 2573 secret.go:281] references non-existent secret key: ca.crt
Apr 21 15:42:32.440212 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.440180 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 21 15:42:32.440212 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.440190 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-6vhnl: references non-existent secret key: ca.crt
Apr 21 15:42:32.440313 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:42:32.440254 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates podName:5bb02280-688c-4ff5-9216-7f370bf82081 nodeName:}" failed. No retries permitted until 2026-04-21 15:42:33.440238344 +0000 UTC m=+448.570291955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates") pod "keda-operator-ffbb595cb-6vhnl" (UID: "5bb02280-688c-4ff5-9216-7f370bf82081") : references non-existent secret key: ca.crt
Apr 21 15:42:32.742420 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.742337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.744757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.744736 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/15baaaf4-8abc-4db2-8dd7-d772eb3a90d1-certificates\") pod \"keda-metrics-apiserver-7c9f485588-c2gbv\" (UID: \"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:32.959959 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:32.959912 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:33.084345 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:33.084321 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"]
Apr 21 15:42:33.086707 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:42:33.086675 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15baaaf4_8abc_4db2_8dd7_d772eb3a90d1.slice/crio-dacff4c65a4201085e825fed12770851ebb77419dfdf709928da515bfb922cbb WatchSource:0}: Error finding container dacff4c65a4201085e825fed12770851ebb77419dfdf709928da515bfb922cbb: Status 404 returned error can't find the container with id dacff4c65a4201085e825fed12770851ebb77419dfdf709928da515bfb922cbb
Apr 21 15:42:33.088031 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:33.088011 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:42:33.447576 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:33.447541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:33.450065 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:33.450042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/5bb02280-688c-4ff5-9216-7f370bf82081-certificates\") pod \"keda-operator-ffbb595cb-6vhnl\" (UID: \"5bb02280-688c-4ff5-9216-7f370bf82081\") " pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:33.516671 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:33.516642 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:33.639774 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:33.639741 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-6vhnl"]
Apr 21 15:42:33.642788 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:42:33.642761 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bb02280_688c_4ff5_9216_7f370bf82081.slice/crio-aaf89c7a08ff6e33709cbf9ccd254ce07bf977e142594d9c31724cf903d0deaa WatchSource:0}: Error finding container aaf89c7a08ff6e33709cbf9ccd254ce07bf977e142594d9c31724cf903d0deaa: Status 404 returned error can't find the container with id aaf89c7a08ff6e33709cbf9ccd254ce07bf977e142594d9c31724cf903d0deaa
Apr 21 15:42:33.798176 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:33.798085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-6vhnl" event={"ID":"5bb02280-688c-4ff5-9216-7f370bf82081","Type":"ContainerStarted","Data":"aaf89c7a08ff6e33709cbf9ccd254ce07bf977e142594d9c31724cf903d0deaa"}
Apr 21 15:42:33.799273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:33.799242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv" event={"ID":"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1","Type":"ContainerStarted","Data":"dacff4c65a4201085e825fed12770851ebb77419dfdf709928da515bfb922cbb"}
Apr 21 15:42:38.816463 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:38.816417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-6vhnl" event={"ID":"5bb02280-688c-4ff5-9216-7f370bf82081","Type":"ContainerStarted","Data":"4fbf169d5559a9b8de16fa316065affc7de455d8ba23c49b711fb5f2f4422cc1"}
Apr 21 15:42:38.817024 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:38.816493 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-6vhnl"
Apr 21 15:42:38.817736 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:38.817716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv" event={"ID":"15baaaf4-8abc-4db2-8dd7-d772eb3a90d1","Type":"ContainerStarted","Data":"8bdfe6a4a62cf2cd5ffd95f22f39986ce589e39a3a2b3a31f2ba6f2778ace202"}
Apr 21 15:42:38.817836 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:38.817823 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv"
Apr 21 15:42:38.836385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:38.836339 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-6vhnl" podStartSLOduration=3.681149823 podStartE2EDuration="7.836328723s" podCreationTimestamp="2026-04-21 15:42:31 +0000 UTC" firstStartedPulling="2026-04-21 15:42:33.644142033 +0000 UTC m=+448.774195649" lastFinishedPulling="2026-04-21 15:42:37.799320932 +0000 UTC m=+452.929374549" observedRunningTime="2026-04-21 15:42:38.834377342 +0000 UTC m=+453.964430974" watchObservedRunningTime="2026-04-21 15:42:38.836328723 +0000 UTC m=+453.966382357"
Apr 21 15:42:38.855303 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:38.855260 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv" podStartSLOduration=2.144124939 podStartE2EDuration="6.85525156s" podCreationTimestamp="2026-04-21 15:42:32 +0000 UTC" firstStartedPulling="2026-04-21 15:42:33.088195425 +0000 UTC m=+448.218249045" lastFinishedPulling="2026-04-21 15:42:37.799322054 +0000 UTC m=+452.929375666" observedRunningTime="2026-04-21 15:42:38.853903582 +0000 UTC m=+453.983957242" watchObservedRunningTime="2026-04-21 15:42:38.85525156 +0000 UTC m=+453.985305193"
Apr 21 15:42:49.825063 ip-10-0-136-123
kubenswrapper[2573]: I0421 15:42:49.825034 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-c2gbv" Apr 21 15:42:59.822219 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:42:59.822185 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-6vhnl" Apr 21 15:43:46.488261 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.488224 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24"] Apr 21 15:43:46.491273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.491253 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" Apr 21 15:43:46.495938 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.495917 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:43:46.496518 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.496500 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-sfbk2\"" Apr 21 15:43:46.496987 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.496967 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 15:43:46.503092 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.503071 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24"] Apr 21 15:43:46.549744 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.549708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcmdn\" (UniqueName: 
\"kubernetes.io/projected/34326c84-951d-4fd0-a6c6-0050ff7950bf-kube-api-access-bcmdn\") pod \"cert-manager-operator-controller-manager-54b9655956-sfw24\" (UID: \"34326c84-951d-4fd0-a6c6-0050ff7950bf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" Apr 21 15:43:46.549910 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.549752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34326c84-951d-4fd0-a6c6-0050ff7950bf-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-sfw24\" (UID: \"34326c84-951d-4fd0-a6c6-0050ff7950bf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" Apr 21 15:43:46.651043 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.651009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcmdn\" (UniqueName: \"kubernetes.io/projected/34326c84-951d-4fd0-a6c6-0050ff7950bf-kube-api-access-bcmdn\") pod \"cert-manager-operator-controller-manager-54b9655956-sfw24\" (UID: \"34326c84-951d-4fd0-a6c6-0050ff7950bf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" Apr 21 15:43:46.651171 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.651061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34326c84-951d-4fd0-a6c6-0050ff7950bf-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-sfw24\" (UID: \"34326c84-951d-4fd0-a6c6-0050ff7950bf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" Apr 21 15:43:46.651407 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.651393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34326c84-951d-4fd0-a6c6-0050ff7950bf-tmp\") pod 
\"cert-manager-operator-controller-manager-54b9655956-sfw24\" (UID: \"34326c84-951d-4fd0-a6c6-0050ff7950bf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" Apr 21 15:43:46.659416 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.659387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcmdn\" (UniqueName: \"kubernetes.io/projected/34326c84-951d-4fd0-a6c6-0050ff7950bf-kube-api-access-bcmdn\") pod \"cert-manager-operator-controller-manager-54b9655956-sfw24\" (UID: \"34326c84-951d-4fd0-a6c6-0050ff7950bf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" Apr 21 15:43:46.800113 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.800028 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" Apr 21 15:43:46.930256 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.930200 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24"] Apr 21 15:43:46.932885 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:43:46.932860 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34326c84_951d_4fd0_a6c6_0050ff7950bf.slice/crio-c965a1311bbb27c120386dc6f646a36f8e60bb1d08e9287b2c4cd1751748aeb0 WatchSource:0}: Error finding container c965a1311bbb27c120386dc6f646a36f8e60bb1d08e9287b2c4cd1751748aeb0: Status 404 returned error can't find the container with id c965a1311bbb27c120386dc6f646a36f8e60bb1d08e9287b2c4cd1751748aeb0 Apr 21 15:43:46.993766 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:46.993733 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" 
event={"ID":"34326c84-951d-4fd0-a6c6-0050ff7950bf","Type":"ContainerStarted","Data":"c965a1311bbb27c120386dc6f646a36f8e60bb1d08e9287b2c4cd1751748aeb0"} Apr 21 15:43:50.003244 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:50.003204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" event={"ID":"34326c84-951d-4fd0-a6c6-0050ff7950bf","Type":"ContainerStarted","Data":"c1196ba8ed4930840c940ed2a32253e14f9ee16183a0233822a6cf77f2ccd55a"} Apr 21 15:43:50.040428 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:43:50.040384 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-sfw24" podStartSLOduration=1.515426334 podStartE2EDuration="4.040369729s" podCreationTimestamp="2026-04-21 15:43:46 +0000 UTC" firstStartedPulling="2026-04-21 15:43:46.935350111 +0000 UTC m=+522.065403723" lastFinishedPulling="2026-04-21 15:43:49.4602935 +0000 UTC m=+524.590347118" observedRunningTime="2026-04-21 15:43:50.03841861 +0000 UTC m=+525.168472268" watchObservedRunningTime="2026-04-21 15:43:50.040369729 +0000 UTC m=+525.170423362" Apr 21 15:44:32.037904 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.037862 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl"] Apr 21 15:44:32.040052 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.040031 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:32.045032 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.045013 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 21 15:44:32.045252 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.045236 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 21 15:44:32.045329 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.045242 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-xzr98\"" Apr 21 15:44:32.051855 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.051831 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl"] Apr 21 15:44:32.163090 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.163056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxkw\" (UniqueName: \"kubernetes.io/projected/765b22cc-8a8d-4c0f-93f3-558fa3a4f71a-kube-api-access-nxxkw\") pod \"servicemesh-operator3-55f49c5f94-vtcxl\" (UID: \"765b22cc-8a8d-4c0f-93f3-558fa3a4f71a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:32.163090 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.163093 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/765b22cc-8a8d-4c0f-93f3-558fa3a4f71a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vtcxl\" (UID: \"765b22cc-8a8d-4c0f-93f3-558fa3a4f71a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:32.264052 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.264023 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nxxkw\" (UniqueName: \"kubernetes.io/projected/765b22cc-8a8d-4c0f-93f3-558fa3a4f71a-kube-api-access-nxxkw\") pod \"servicemesh-operator3-55f49c5f94-vtcxl\" (UID: \"765b22cc-8a8d-4c0f-93f3-558fa3a4f71a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:32.264052 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.264056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/765b22cc-8a8d-4c0f-93f3-558fa3a4f71a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vtcxl\" (UID: \"765b22cc-8a8d-4c0f-93f3-558fa3a4f71a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:32.266519 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.266488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/765b22cc-8a8d-4c0f-93f3-558fa3a4f71a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-vtcxl\" (UID: \"765b22cc-8a8d-4c0f-93f3-558fa3a4f71a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:32.272524 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.272502 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxxkw\" (UniqueName: \"kubernetes.io/projected/765b22cc-8a8d-4c0f-93f3-558fa3a4f71a-kube-api-access-nxxkw\") pod \"servicemesh-operator3-55f49c5f94-vtcxl\" (UID: \"765b22cc-8a8d-4c0f-93f3-558fa3a4f71a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:32.349177 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.349135 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:32.482909 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:32.481612 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl"] Apr 21 15:44:32.485884 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:44:32.485846 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765b22cc_8a8d_4c0f_93f3_558fa3a4f71a.slice/crio-e92da6d09c67a440bff36210214daab8a09b4b7033632824f95c48881297ddb7 WatchSource:0}: Error finding container e92da6d09c67a440bff36210214daab8a09b4b7033632824f95c48881297ddb7: Status 404 returned error can't find the container with id e92da6d09c67a440bff36210214daab8a09b4b7033632824f95c48881297ddb7 Apr 21 15:44:33.114399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:33.114353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" event={"ID":"765b22cc-8a8d-4c0f-93f3-558fa3a4f71a","Type":"ContainerStarted","Data":"e92da6d09c67a440bff36210214daab8a09b4b7033632824f95c48881297ddb7"} Apr 21 15:44:36.128479 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:36.128353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" event={"ID":"765b22cc-8a8d-4c0f-93f3-558fa3a4f71a","Type":"ContainerStarted","Data":"e91337cfcc9d1bcee79e874b89ad070836804d618856668778d69f4871d081f7"} Apr 21 15:44:36.128986 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:36.128518 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:36.158489 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:36.158434 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" podStartSLOduration=0.824352099 podStartE2EDuration="4.158417653s" podCreationTimestamp="2026-04-21 15:44:32 +0000 UTC" firstStartedPulling="2026-04-21 15:44:32.487578786 +0000 UTC m=+567.617632398" lastFinishedPulling="2026-04-21 15:44:35.821644338 +0000 UTC m=+570.951697952" observedRunningTime="2026-04-21 15:44:36.156329045 +0000 UTC m=+571.286382680" watchObservedRunningTime="2026-04-21 15:44:36.158417653 +0000 UTC m=+571.288471288" Apr 21 15:44:43.126210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.126169 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g"] Apr 21 15:44:43.128383 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.128361 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.133281 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.133263 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 21 15:44:43.133521 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.133506 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 21 15:44:43.133610 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.133558 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 21 15:44:43.133871 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.133851 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 21 15:44:43.133988 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.133876 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-hvdmm\"" Apr 21 15:44:43.134183 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.134167 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 15:44:43.134283 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.134231 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 15:44:43.162482 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.162460 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g"] Apr 21 15:44:43.248894 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.248848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.249052 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.248924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.249052 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.248954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b08c5e38-0e05-411f-b67a-1427e00f2b85-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 
15:44:43.249052 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.248975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.249052 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.249016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.249186 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.249056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.249186 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.249099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rkr\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-kube-api-access-69rkr\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.350039 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.350002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.350039 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.350042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b08c5e38-0e05-411f-b67a-1427e00f2b85-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.350266 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.350067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.350266 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.350103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.350266 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.350135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.350426 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.350310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69rkr\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-kube-api-access-69rkr\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.350426 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.350373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.350886 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.350860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.352942 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.352921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b08c5e38-0e05-411f-b67a-1427e00f2b85-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.353047 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.352957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.353047 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.352992 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.353306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.353288 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.363758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.363731 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69rkr\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-kube-api-access-69rkr\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.364104 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.364084 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-rr47g\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.438023 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.437937 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:43.573042 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:43.573011 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g"] Apr 21 15:44:43.574310 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:44:43.574284 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb08c5e38_0e05_411f_b67a_1427e00f2b85.slice/crio-f11ae929d46834609a352476ecab53cbe7c257056fdbb4cc3c695cc9eff61eee WatchSource:0}: Error finding container f11ae929d46834609a352476ecab53cbe7c257056fdbb4cc3c695cc9eff61eee: Status 404 returned error can't find the container with id f11ae929d46834609a352476ecab53cbe7c257056fdbb4cc3c695cc9eff61eee Apr 21 15:44:44.154309 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:44.154268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" event={"ID":"b08c5e38-0e05-411f-b67a-1427e00f2b85","Type":"ContainerStarted","Data":"f11ae929d46834609a352476ecab53cbe7c257056fdbb4cc3c695cc9eff61eee"} Apr 21 15:44:45.696909 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:45.696862 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:44:45.697224 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:45.696935 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:44:46.161850 ip-10-0-136-123 kubenswrapper[2573]: 
I0421 15:44:46.161812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" event={"ID":"b08c5e38-0e05-411f-b67a-1427e00f2b85","Type":"ContainerStarted","Data":"95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f"} Apr 21 15:44:46.162028 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:46.162011 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:46.163745 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:46.163715 2573 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-rr47g container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 21 15:44:46.163898 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:46.163770 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" podUID="b08c5e38-0e05-411f-b67a-1427e00f2b85" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:44:46.187184 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:46.187116 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" podStartSLOduration=1.066720064 podStartE2EDuration="3.187100757s" podCreationTimestamp="2026-04-21 15:44:43 +0000 UTC" firstStartedPulling="2026-04-21 15:44:43.57625197 +0000 UTC m=+578.706305585" lastFinishedPulling="2026-04-21 15:44:45.696632653 +0000 UTC m=+580.826686278" observedRunningTime="2026-04-21 15:44:46.186125105 +0000 UTC m=+581.316178738" watchObservedRunningTime="2026-04-21 15:44:46.187100757 +0000 UTC m=+581.317154385" Apr 21 15:44:47.133963 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:47.133928 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operators/servicemesh-operator3-55f49c5f94-vtcxl" Apr 21 15:44:47.165983 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:47.165958 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:44:50.560121 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.560083 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz"] Apr 21 15:44:50.562531 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.562509 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.568636 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.568613 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-5w5wz\"" Apr 21 15:44:50.609261 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.609229 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz"] Apr 21 15:44:50.713475 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713446 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.713475 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713479 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-podinfo\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.713706 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713504 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48wmz\" (UniqueName: \"kubernetes.io/projected/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-kube-api-access-48wmz\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.713706 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713594 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.713706 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713638 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.713706 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-token\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.713882 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.713882 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.713882 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.713772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.814789 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.814757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-workload-certs\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.814962 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.814833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.814962 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.814853 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.814962 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.814874 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.814962 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.814893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: 
\"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.814962 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.814924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wmz\" (UniqueName: \"kubernetes.io/projected/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-kube-api-access-48wmz\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.814962 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.814958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.815262 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.815068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.815262 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.815127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.815262 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.815225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.815395 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.815358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.815553 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.815525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.815682 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.815596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.815853 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:44:50.815820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.817322 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.817304 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.817702 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.817677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.832317 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.832285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.832408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.832292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48wmz\" 
(UniqueName: \"kubernetes.io/projected/79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996-kube-api-access-48wmz\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-8z8tz\" (UID: \"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:50.872262 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:50.872228 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:51.043898 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:51.043865 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz"] Apr 21 15:44:51.047350 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:44:51.047322 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79cfeeaf_2ff6_4d93_9ee3_7d75b97d3996.slice/crio-812a950be9662cc7d670f50a88ad7f8768296a23514f6765587b35d07454d339 WatchSource:0}: Error finding container 812a950be9662cc7d670f50a88ad7f8768296a23514f6765587b35d07454d339: Status 404 returned error can't find the container with id 812a950be9662cc7d670f50a88ad7f8768296a23514f6765587b35d07454d339 Apr 21 15:44:51.178964 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:51.178883 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" event={"ID":"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996","Type":"ContainerStarted","Data":"812a950be9662cc7d670f50a88ad7f8768296a23514f6765587b35d07454d339"} Apr 21 15:44:53.398770 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:53.398725 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:44:53.399031 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:44:53.398822 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:44:53.399031 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:53.398859 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:44:54.190342 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:54.190309 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" event={"ID":"79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996","Type":"ContainerStarted","Data":"075db5c687434d29f5df31bf5c5acb4d84282d3a6fbd19d47e25fd492105ceee"} Apr 21 15:44:54.216005 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:54.215950 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" podStartSLOduration=1.86671138 podStartE2EDuration="4.215934061s" podCreationTimestamp="2026-04-21 15:44:50 +0000 UTC" firstStartedPulling="2026-04-21 15:44:51.049276048 +0000 UTC m=+586.179329663" lastFinishedPulling="2026-04-21 15:44:53.398498732 +0000 UTC m=+588.528552344" observedRunningTime="2026-04-21 15:44:54.214521253 +0000 UTC m=+589.344574899" watchObservedRunningTime="2026-04-21 15:44:54.215934061 +0000 UTC m=+589.345987695" Apr 21 15:44:54.873389 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:54.873348 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:54.877817 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:54.877781 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:55.193820 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:55.193704 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:44:55.194649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:44:55.194630 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-8z8tz" Apr 21 15:45:05.403453 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:05.403422 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 15:45:05.404157 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:05.404136 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 15:45:14.959272 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:14.959235 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254"] Apr 21 15:45:14.966495 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:14.966472 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" Apr 21 15:45:14.969202 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:14.969173 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-mvprw\"" Apr 21 15:45:14.969320 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:14.969214 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 15:45:14.970532 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:14.970511 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 15:45:14.975293 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:14.975269 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254"] Apr 21 15:45:14.996074 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:14.996045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ds8w\" (UniqueName: \"kubernetes.io/projected/31e563af-bf75-499d-93b3-29b3469180f2-kube-api-access-2ds8w\") pod \"limitador-operator-controller-manager-c7fb4c8d5-8z254\" (UID: \"31e563af-bf75-499d-93b3-29b3469180f2\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" Apr 21 15:45:15.097173 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:15.097140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ds8w\" (UniqueName: \"kubernetes.io/projected/31e563af-bf75-499d-93b3-29b3469180f2-kube-api-access-2ds8w\") pod \"limitador-operator-controller-manager-c7fb4c8d5-8z254\" (UID: \"31e563af-bf75-499d-93b3-29b3469180f2\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" Apr 21 15:45:15.106511 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:45:15.106479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ds8w\" (UniqueName: \"kubernetes.io/projected/31e563af-bf75-499d-93b3-29b3469180f2-kube-api-access-2ds8w\") pod \"limitador-operator-controller-manager-c7fb4c8d5-8z254\" (UID: \"31e563af-bf75-499d-93b3-29b3469180f2\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" Apr 21 15:45:15.277052 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:15.276980 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" Apr 21 15:45:15.404483 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:15.404459 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254"] Apr 21 15:45:15.406401 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:45:15.406370 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31e563af_bf75_499d_93b3_29b3469180f2.slice/crio-bb02269509865fefd2d74639fa23b859a83b44c828b437535ec4a9e7ff1f8ccb WatchSource:0}: Error finding container bb02269509865fefd2d74639fa23b859a83b44c828b437535ec4a9e7ff1f8ccb: Status 404 returned error can't find the container with id bb02269509865fefd2d74639fa23b859a83b44c828b437535ec4a9e7ff1f8ccb Apr 21 15:45:16.255745 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:16.255704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" event={"ID":"31e563af-bf75-499d-93b3-29b3469180f2","Type":"ContainerStarted","Data":"bb02269509865fefd2d74639fa23b859a83b44c828b437535ec4a9e7ff1f8ccb"} Apr 21 15:45:17.840336 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:17.840297 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-jpxwt"] Apr 
21 15:45:17.842952 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:17.842931 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" Apr 21 15:45:17.847376 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:17.847351 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-zt4gv\"" Apr 21 15:45:17.864988 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:17.864961 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-jpxwt"] Apr 21 15:45:17.916429 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:17.916381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz89x\" (UniqueName: \"kubernetes.io/projected/bfa1b708-a291-48e4-a35f-8ddf287fc1b8-kube-api-access-nz89x\") pod \"authorino-operator-7587b89b76-jpxwt\" (UID: \"bfa1b708-a291-48e4-a35f-8ddf287fc1b8\") " pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" Apr 21 15:45:18.017242 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:18.017220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz89x\" (UniqueName: \"kubernetes.io/projected/bfa1b708-a291-48e4-a35f-8ddf287fc1b8-kube-api-access-nz89x\") pod \"authorino-operator-7587b89b76-jpxwt\" (UID: \"bfa1b708-a291-48e4-a35f-8ddf287fc1b8\") " pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" Apr 21 15:45:18.026647 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:18.026627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz89x\" (UniqueName: \"kubernetes.io/projected/bfa1b708-a291-48e4-a35f-8ddf287fc1b8-kube-api-access-nz89x\") pod \"authorino-operator-7587b89b76-jpxwt\" (UID: \"bfa1b708-a291-48e4-a35f-8ddf287fc1b8\") " pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" Apr 21 15:45:18.156961 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:18.156887 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" Apr 21 15:45:18.266090 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:18.266056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" event={"ID":"31e563af-bf75-499d-93b3-29b3469180f2","Type":"ContainerStarted","Data":"b6daa34f4838dd6c06e59377540e24ac5eba4a8b6350e02c460dbce463a6fef1"} Apr 21 15:45:18.266233 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:18.266115 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" Apr 21 15:45:18.284786 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:18.284760 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-jpxwt"] Apr 21 15:45:18.288337 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:45:18.288308 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa1b708_a291_48e4_a35f_8ddf287fc1b8.slice/crio-9e26d35dfb72d31cf5cd35f36cf2cdd9c189a6c4b4ae7c8720c27614bdcf7955 WatchSource:0}: Error finding container 9e26d35dfb72d31cf5cd35f36cf2cdd9c189a6c4b4ae7c8720c27614bdcf7955: Status 404 returned error can't find the container with id 9e26d35dfb72d31cf5cd35f36cf2cdd9c189a6c4b4ae7c8720c27614bdcf7955 Apr 21 15:45:18.297316 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:18.297266 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" podStartSLOduration=1.709630034 podStartE2EDuration="4.297251742s" podCreationTimestamp="2026-04-21 15:45:14 +0000 UTC" firstStartedPulling="2026-04-21 15:45:15.408540765 +0000 UTC m=+610.538594377" 
lastFinishedPulling="2026-04-21 15:45:17.996162455 +0000 UTC m=+613.126216085" observedRunningTime="2026-04-21 15:45:18.295972517 +0000 UTC m=+613.426026150" watchObservedRunningTime="2026-04-21 15:45:18.297251742 +0000 UTC m=+613.427305377" Apr 21 15:45:19.270736 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:19.270697 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" event={"ID":"bfa1b708-a291-48e4-a35f-8ddf287fc1b8","Type":"ContainerStarted","Data":"9e26d35dfb72d31cf5cd35f36cf2cdd9c189a6c4b4ae7c8720c27614bdcf7955"} Apr 21 15:45:21.278703 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:21.278665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" event={"ID":"bfa1b708-a291-48e4-a35f-8ddf287fc1b8","Type":"ContainerStarted","Data":"43be37371b96f13c5a421166b7f945c9ef7a3c183828dec90db62b818d3d25fd"} Apr 21 15:45:21.279225 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:21.278808 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" Apr 21 15:45:21.316187 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:21.316139 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" podStartSLOduration=2.372341042 podStartE2EDuration="4.316124065s" podCreationTimestamp="2026-04-21 15:45:17 +0000 UTC" firstStartedPulling="2026-04-21 15:45:18.290828931 +0000 UTC m=+613.420882543" lastFinishedPulling="2026-04-21 15:45:20.23461194 +0000 UTC m=+615.364665566" observedRunningTime="2026-04-21 15:45:21.312582442 +0000 UTC m=+616.442636076" watchObservedRunningTime="2026-04-21 15:45:21.316124065 +0000 UTC m=+616.446177699" Apr 21 15:45:29.272890 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:29.272807 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8z254" Apr 21 15:45:32.283537 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:45:32.283506 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-jpxwt" Apr 21 15:46:11.041918 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.038696 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-skws5"] Apr 21 15:46:11.043733 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.043108 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:11.046036 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.046012 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 15:46:11.047476 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.047458 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pv44v\"" Apr 21 15:46:11.048385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.048360 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-skws5"] Apr 21 15:46:11.078569 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.078546 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-skws5"] Apr 21 15:46:11.229629 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.229595 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/65e3be56-85d2-4fd7-8fc9-dd5648a64ac8-config-file\") pod \"limitador-limitador-67566c68b4-skws5\" (UID: \"65e3be56-85d2-4fd7-8fc9-dd5648a64ac8\") " pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:11.229785 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.229652 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprh4\" (UniqueName: \"kubernetes.io/projected/65e3be56-85d2-4fd7-8fc9-dd5648a64ac8-kube-api-access-qprh4\") pod \"limitador-limitador-67566c68b4-skws5\" (UID: \"65e3be56-85d2-4fd7-8fc9-dd5648a64ac8\") " pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:11.330180 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.330147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/65e3be56-85d2-4fd7-8fc9-dd5648a64ac8-config-file\") pod \"limitador-limitador-67566c68b4-skws5\" (UID: \"65e3be56-85d2-4fd7-8fc9-dd5648a64ac8\") " pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:11.330324 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.330207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qprh4\" (UniqueName: \"kubernetes.io/projected/65e3be56-85d2-4fd7-8fc9-dd5648a64ac8-kube-api-access-qprh4\") pod \"limitador-limitador-67566c68b4-skws5\" (UID: \"65e3be56-85d2-4fd7-8fc9-dd5648a64ac8\") " pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:11.330758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.330737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/65e3be56-85d2-4fd7-8fc9-dd5648a64ac8-config-file\") pod \"limitador-limitador-67566c68b4-skws5\" (UID: \"65e3be56-85d2-4fd7-8fc9-dd5648a64ac8\") " pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:11.338743 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.338716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprh4\" (UniqueName: 
\"kubernetes.io/projected/65e3be56-85d2-4fd7-8fc9-dd5648a64ac8-kube-api-access-qprh4\") pod \"limitador-limitador-67566c68b4-skws5\" (UID: \"65e3be56-85d2-4fd7-8fc9-dd5648a64ac8\") " pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:11.354689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.354667 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:11.481770 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:11.481739 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-skws5"] Apr 21 15:46:11.482125 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:46:11.482097 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65e3be56_85d2_4fd7_8fc9_dd5648a64ac8.slice/crio-3c3239e9da8064924233df91e82b8b0be9a14b3a58b38680b5f7568c8cf7d5c5 WatchSource:0}: Error finding container 3c3239e9da8064924233df91e82b8b0be9a14b3a58b38680b5f7568c8cf7d5c5: Status 404 returned error can't find the container with id 3c3239e9da8064924233df91e82b8b0be9a14b3a58b38680b5f7568c8cf7d5c5 Apr 21 15:46:12.437256 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:12.437219 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" event={"ID":"65e3be56-85d2-4fd7-8fc9-dd5648a64ac8","Type":"ContainerStarted","Data":"3c3239e9da8064924233df91e82b8b0be9a14b3a58b38680b5f7568c8cf7d5c5"} Apr 21 15:46:18.464952 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:18.464909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" event={"ID":"65e3be56-85d2-4fd7-8fc9-dd5648a64ac8","Type":"ContainerStarted","Data":"8eca8a6f5ac8718c3bd17c3027ed0c49b137a64af4d4abfdece4130615e6e27a"} Apr 21 15:46:18.465381 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:18.465023 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:18.483979 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:18.483933 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" podStartSLOduration=1.158341158 podStartE2EDuration="7.4839201s" podCreationTimestamp="2026-04-21 15:46:11 +0000 UTC" firstStartedPulling="2026-04-21 15:46:11.484042168 +0000 UTC m=+666.614095779" lastFinishedPulling="2026-04-21 15:46:17.80962111 +0000 UTC m=+672.939674721" observedRunningTime="2026-04-21 15:46:18.483323704 +0000 UTC m=+673.613377340" watchObservedRunningTime="2026-04-21 15:46:18.4839201 +0000 UTC m=+673.613973733" Apr 21 15:46:29.470636 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:29.470607 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-skws5" Apr 21 15:46:53.031045 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.031009 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g"] Apr 21 15:46:53.031598 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.031282 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" podUID="b08c5e38-0e05-411f-b67a-1427e00f2b85" containerName="discovery" containerID="cri-o://95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f" gracePeriod=30 Apr 21 15:46:53.166266 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.166226 2573 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-rr47g container/discovery namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://10.132.0.17:8080/ready\": dial tcp 10.132.0.17:8080: connect: connection refused" start-of-body= Apr 21 15:46:53.166419 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:46:53.166303 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" podUID="b08c5e38-0e05-411f-b67a-1427e00f2b85" containerName="discovery" probeResult="failure" output="Get \"http://10.132.0.17:8080/ready\": dial tcp 10.132.0.17:8080: connect: connection refused" Apr 21 15:46:53.279446 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.279423 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:46:53.344028 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.343997 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-ca-configmap\") pod \"b08c5e38-0e05-411f-b67a-1427e00f2b85\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " Apr 21 15:46:53.344213 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.344058 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b08c5e38-0e05-411f-b67a-1427e00f2b85-local-certs\") pod \"b08c5e38-0e05-411f-b67a-1427e00f2b85\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " Apr 21 15:46:53.344213 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.344088 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-kubeconfig\") pod \"b08c5e38-0e05-411f-b67a-1427e00f2b85\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " Apr 21 15:46:53.344213 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.344112 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: 
\"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-dns-cert\") pod \"b08c5e38-0e05-411f-b67a-1427e00f2b85\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " Apr 21 15:46:53.344213 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.344134 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-token\") pod \"b08c5e38-0e05-411f-b67a-1427e00f2b85\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " Apr 21 15:46:53.344213 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.344176 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69rkr\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-kube-api-access-69rkr\") pod \"b08c5e38-0e05-411f-b67a-1427e00f2b85\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " Apr 21 15:46:53.344447 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.344315 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-cacerts\") pod \"b08c5e38-0e05-411f-b67a-1427e00f2b85\" (UID: \"b08c5e38-0e05-411f-b67a-1427e00f2b85\") " Apr 21 15:46:53.344496 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.344456 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "b08c5e38-0e05-411f-b67a-1427e00f2b85" (UID: "b08c5e38-0e05-411f-b67a-1427e00f2b85"). InnerVolumeSpecName "istio-csr-ca-configmap". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:46:53.344602 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.344583 2573 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-ca-configmap\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:46:53.346596 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.346551 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "b08c5e38-0e05-411f-b67a-1427e00f2b85" (UID: "b08c5e38-0e05-411f-b67a-1427e00f2b85"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:46:53.346736 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.346688 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "b08c5e38-0e05-411f-b67a-1427e00f2b85" (UID: "b08c5e38-0e05-411f-b67a-1427e00f2b85"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:46:53.346736 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.346719 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08c5e38-0e05-411f-b67a-1427e00f2b85-local-certs" (OuterVolumeSpecName: "local-certs") pod "b08c5e38-0e05-411f-b67a-1427e00f2b85" (UID: "b08c5e38-0e05-411f-b67a-1427e00f2b85"). InnerVolumeSpecName "local-certs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:46:53.346875 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.346860 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-token" (OuterVolumeSpecName: "istio-token") pod "b08c5e38-0e05-411f-b67a-1427e00f2b85" (UID: "b08c5e38-0e05-411f-b67a-1427e00f2b85"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:46:53.347013 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.346993 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-cacerts" (OuterVolumeSpecName: "cacerts") pod "b08c5e38-0e05-411f-b67a-1427e00f2b85" (UID: "b08c5e38-0e05-411f-b67a-1427e00f2b85"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:46:53.347123 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.347043 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-kube-api-access-69rkr" (OuterVolumeSpecName: "kube-api-access-69rkr") pod "b08c5e38-0e05-411f-b67a-1427e00f2b85" (UID: "b08c5e38-0e05-411f-b67a-1427e00f2b85"). InnerVolumeSpecName "kube-api-access-69rkr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:46:53.446015 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.445976 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-69rkr\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-kube-api-access-69rkr\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:46:53.446166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.446017 2573 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-cacerts\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:46:53.446166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.446052 2573 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/b08c5e38-0e05-411f-b67a-1427e00f2b85-local-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:46:53.446166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.446067 2573 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-kubeconfig\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:46:53.446166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.446093 2573 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-csr-dns-cert\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:46:53.446166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.446106 2573 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/b08c5e38-0e05-411f-b67a-1427e00f2b85-istio-token\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:46:53.572397 ip-10-0-136-123 kubenswrapper[2573]: I0421 
15:46:53.572355 2573 generic.go:358] "Generic (PLEG): container finished" podID="b08c5e38-0e05-411f-b67a-1427e00f2b85" containerID="95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f" exitCode=0 Apr 21 15:46:53.572553 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.572395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" event={"ID":"b08c5e38-0e05-411f-b67a-1427e00f2b85","Type":"ContainerDied","Data":"95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f"} Apr 21 15:46:53.572553 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.572424 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" Apr 21 15:46:53.572553 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.572436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g" event={"ID":"b08c5e38-0e05-411f-b67a-1427e00f2b85","Type":"ContainerDied","Data":"f11ae929d46834609a352476ecab53cbe7c257056fdbb4cc3c695cc9eff61eee"} Apr 21 15:46:53.572553 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.572454 2573 scope.go:117] "RemoveContainer" containerID="95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f" Apr 21 15:46:53.580815 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.580767 2573 scope.go:117] "RemoveContainer" containerID="95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f" Apr 21 15:46:53.581056 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:46:53.581031 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f\": container with ID starting with 95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f not found: ID does not exist" 
containerID="95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f" Apr 21 15:46:53.581109 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.581068 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f"} err="failed to get container status \"95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f\": rpc error: code = NotFound desc = could not find container \"95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f\": container with ID starting with 95ecf9d47eeeb28d91d7e128e5d6c6311be2deb4b07527848c1c303fb936a48f not found: ID does not exist" Apr 21 15:46:53.590870 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.590842 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g"] Apr 21 15:46:53.594008 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:53.593989 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-rr47g"] Apr 21 15:46:55.482144 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:55.482069 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08c5e38-0e05-411f-b67a-1427e00f2b85" path="/var/lib/kubelet/pods/b08c5e38-0e05-411f-b67a-1427e00f2b85/volumes" Apr 21 15:46:57.306691 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.306635 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-45qb8"] Apr 21 15:46:57.307166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.307122 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b08c5e38-0e05-411f-b67a-1427e00f2b85" containerName="discovery" Apr 21 15:46:57.307166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.307141 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08c5e38-0e05-411f-b67a-1427e00f2b85" containerName="discovery" Apr 
21 15:46:57.307273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.307223 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b08c5e38-0e05-411f-b67a-1427e00f2b85" containerName="discovery" Apr 21 15:46:57.311231 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.311210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" Apr 21 15:46:57.314396 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.314374 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 21 15:46:57.314493 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.314372 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 21 15:46:57.314493 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.314453 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-hjdv4\"" Apr 21 15:46:57.315549 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.315530 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 21 15:46:57.319405 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.319375 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-45qb8"] Apr 21 15:46:57.324223 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.324202 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-7454559757-dpm6j"] Apr 21 15:46:57.327257 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.327238 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" Apr 21 15:46:57.329978 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.329942 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-ftn4z\"" Apr 21 15:46:57.329978 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.329969 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 21 15:46:57.334274 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.334224 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7454559757-dpm6j"] Apr 21 15:46:57.356183 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.356155 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-b8b7x"] Apr 21 15:46:57.359266 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.359245 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-b8b7x" Apr 21 15:46:57.361902 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.361884 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 21 15:46:57.362127 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.361982 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-kz6r4\"" Apr 21 15:46:57.367271 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.367251 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-b8b7x"] Apr 21 15:46:57.377830 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.377786 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvcsn\" (UniqueName: \"kubernetes.io/projected/fd643b9b-bf44-4eee-85c4-78480002fb7a-kube-api-access-tvcsn\") pod \"kserve-controller-manager-9c85dd4d8-45qb8\" (UID: \"fd643b9b-bf44-4eee-85c4-78480002fb7a\") " pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" Apr 21 15:46:57.377946 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.377864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd643b9b-bf44-4eee-85c4-78480002fb7a-cert\") pod \"kserve-controller-manager-9c85dd4d8-45qb8\" (UID: \"fd643b9b-bf44-4eee-85c4-78480002fb7a\") " pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" Apr 21 15:46:57.378006 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.377955 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56a319d7-3883-4871-bc37-8a8b0772785c-cert\") pod \"llmisvc-controller-manager-7454559757-dpm6j\" (UID: \"56a319d7-3883-4871-bc37-8a8b0772785c\") " pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" Apr 21 15:46:57.378064 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.378011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d687v\" (UniqueName: \"kubernetes.io/projected/56a319d7-3883-4871-bc37-8a8b0772785c-kube-api-access-d687v\") pod \"llmisvc-controller-manager-7454559757-dpm6j\" (UID: \"56a319d7-3883-4871-bc37-8a8b0772785c\") " pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" Apr 21 15:46:57.479195 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.479166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56a319d7-3883-4871-bc37-8a8b0772785c-cert\") pod \"llmisvc-controller-manager-7454559757-dpm6j\" (UID: \"56a319d7-3883-4871-bc37-8a8b0772785c\") " pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" Apr 21 15:46:57.479365 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.479199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d687v\" (UniqueName: \"kubernetes.io/projected/56a319d7-3883-4871-bc37-8a8b0772785c-kube-api-access-d687v\") pod \"llmisvc-controller-manager-7454559757-dpm6j\" (UID: \"56a319d7-3883-4871-bc37-8a8b0772785c\") " pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" Apr 21 15:46:57.479365 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.479236 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdxsq\" (UniqueName: \"kubernetes.io/projected/06c79565-0f30-407d-99a1-82fd07c760f3-kube-api-access-xdxsq\") pod \"seaweedfs-86cc847c5c-b8b7x\" (UID: \"06c79565-0f30-407d-99a1-82fd07c760f3\") " pod="kserve/seaweedfs-86cc847c5c-b8b7x" Apr 21 15:46:57.479365 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.479272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvcsn\" (UniqueName: 
\"kubernetes.io/projected/fd643b9b-bf44-4eee-85c4-78480002fb7a-kube-api-access-tvcsn\") pod \"kserve-controller-manager-9c85dd4d8-45qb8\" (UID: \"fd643b9b-bf44-4eee-85c4-78480002fb7a\") " pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" Apr 21 15:46:57.479513 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.479434 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/06c79565-0f30-407d-99a1-82fd07c760f3-data\") pod \"seaweedfs-86cc847c5c-b8b7x\" (UID: \"06c79565-0f30-407d-99a1-82fd07c760f3\") " pod="kserve/seaweedfs-86cc847c5c-b8b7x" Apr 21 15:46:57.479513 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.479479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd643b9b-bf44-4eee-85c4-78480002fb7a-cert\") pod \"kserve-controller-manager-9c85dd4d8-45qb8\" (UID: \"fd643b9b-bf44-4eee-85c4-78480002fb7a\") " pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" Apr 21 15:46:57.481891 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.481870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56a319d7-3883-4871-bc37-8a8b0772785c-cert\") pod \"llmisvc-controller-manager-7454559757-dpm6j\" (UID: \"56a319d7-3883-4871-bc37-8a8b0772785c\") " pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" Apr 21 15:46:57.482262 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.482240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd643b9b-bf44-4eee-85c4-78480002fb7a-cert\") pod \"kserve-controller-manager-9c85dd4d8-45qb8\" (UID: \"fd643b9b-bf44-4eee-85c4-78480002fb7a\") " pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" Apr 21 15:46:57.489544 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.489510 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-d687v\" (UniqueName: \"kubernetes.io/projected/56a319d7-3883-4871-bc37-8a8b0772785c-kube-api-access-d687v\") pod \"llmisvc-controller-manager-7454559757-dpm6j\" (UID: \"56a319d7-3883-4871-bc37-8a8b0772785c\") " pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" Apr 21 15:46:57.489924 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.489904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvcsn\" (UniqueName: \"kubernetes.io/projected/fd643b9b-bf44-4eee-85c4-78480002fb7a-kube-api-access-tvcsn\") pod \"kserve-controller-manager-9c85dd4d8-45qb8\" (UID: \"fd643b9b-bf44-4eee-85c4-78480002fb7a\") " pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" Apr 21 15:46:57.584866 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.584151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/06c79565-0f30-407d-99a1-82fd07c760f3-data\") pod \"seaweedfs-86cc847c5c-b8b7x\" (UID: \"06c79565-0f30-407d-99a1-82fd07c760f3\") " pod="kserve/seaweedfs-86cc847c5c-b8b7x" Apr 21 15:46:57.584866 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.584261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdxsq\" (UniqueName: \"kubernetes.io/projected/06c79565-0f30-407d-99a1-82fd07c760f3-kube-api-access-xdxsq\") pod \"seaweedfs-86cc847c5c-b8b7x\" (UID: \"06c79565-0f30-407d-99a1-82fd07c760f3\") " pod="kserve/seaweedfs-86cc847c5c-b8b7x" Apr 21 15:46:57.584866 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.584533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/06c79565-0f30-407d-99a1-82fd07c760f3-data\") pod \"seaweedfs-86cc847c5c-b8b7x\" (UID: \"06c79565-0f30-407d-99a1-82fd07c760f3\") " pod="kserve/seaweedfs-86cc847c5c-b8b7x" Apr 21 15:46:57.593154 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.593128 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdxsq\" (UniqueName: \"kubernetes.io/projected/06c79565-0f30-407d-99a1-82fd07c760f3-kube-api-access-xdxsq\") pod \"seaweedfs-86cc847c5c-b8b7x\" (UID: \"06c79565-0f30-407d-99a1-82fd07c760f3\") " pod="kserve/seaweedfs-86cc847c5c-b8b7x" Apr 21 15:46:57.621755 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.621712 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" Apr 21 15:46:57.637948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.637924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" Apr 21 15:46:57.669788 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.669754 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-b8b7x" Apr 21 15:46:57.761350 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.761319 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-45qb8"] Apr 21 15:46:57.763747 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:46:57.763717 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd643b9b_bf44_4eee_85c4_78480002fb7a.slice/crio-4e2c752ba93a80d62def12e698f3dcbeaca6771b8ed7bf1ce9fc948aa3a135b6 WatchSource:0}: Error finding container 4e2c752ba93a80d62def12e698f3dcbeaca6771b8ed7bf1ce9fc948aa3a135b6: Status 404 returned error can't find the container with id 4e2c752ba93a80d62def12e698f3dcbeaca6771b8ed7bf1ce9fc948aa3a135b6 Apr 21 15:46:57.790190 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.790163 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7454559757-dpm6j"] Apr 21 15:46:57.791290 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:46:57.791267 2573 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod56a319d7_3883_4871_bc37_8a8b0772785c.slice/crio-151332a37a3e95397413b2b0a2e78145273da6e14515cd5744c289656b2a2fd4 WatchSource:0}: Error finding container 151332a37a3e95397413b2b0a2e78145273da6e14515cd5744c289656b2a2fd4: Status 404 returned error can't find the container with id 151332a37a3e95397413b2b0a2e78145273da6e14515cd5744c289656b2a2fd4
Apr 21 15:46:57.815985 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:57.815939 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-b8b7x"]
Apr 21 15:46:57.818042 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:46:57.818009 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06c79565_0f30_407d_99a1_82fd07c760f3.slice/crio-afe7a9a696ba51a4c1593353b427875f6c516ddae80b9b202257e26efc435392 WatchSource:0}: Error finding container afe7a9a696ba51a4c1593353b427875f6c516ddae80b9b202257e26efc435392: Status 404 returned error can't find the container with id afe7a9a696ba51a4c1593353b427875f6c516ddae80b9b202257e26efc435392
Apr 21 15:46:58.595742 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:58.595704 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" event={"ID":"56a319d7-3883-4871-bc37-8a8b0772785c","Type":"ContainerStarted","Data":"151332a37a3e95397413b2b0a2e78145273da6e14515cd5744c289656b2a2fd4"}
Apr 21 15:46:58.597028 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:58.596995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" event={"ID":"fd643b9b-bf44-4eee-85c4-78480002fb7a","Type":"ContainerStarted","Data":"4e2c752ba93a80d62def12e698f3dcbeaca6771b8ed7bf1ce9fc948aa3a135b6"}
Apr 21 15:46:58.598674 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:46:58.598645 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-b8b7x" event={"ID":"06c79565-0f30-407d-99a1-82fd07c760f3","Type":"ContainerStarted","Data":"afe7a9a696ba51a4c1593353b427875f6c516ddae80b9b202257e26efc435392"}
Apr 21 15:47:02.619171 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:02.619127 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" event={"ID":"fd643b9b-bf44-4eee-85c4-78480002fb7a","Type":"ContainerStarted","Data":"802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5"}
Apr 21 15:47:02.619537 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:02.619268 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8"
Apr 21 15:47:02.637161 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:02.637080 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" podStartSLOduration=1.5925186359999999 podStartE2EDuration="5.637059884s" podCreationTimestamp="2026-04-21 15:46:57 +0000 UTC" firstStartedPulling="2026-04-21 15:46:57.765478043 +0000 UTC m=+712.895531677" lastFinishedPulling="2026-04-21 15:47:01.810019314 +0000 UTC m=+716.940072925" observedRunningTime="2026-04-21 15:47:02.63473379 +0000 UTC m=+717.764787421" watchObservedRunningTime="2026-04-21 15:47:02.637059884 +0000 UTC m=+717.767113518"
Apr 21 15:47:03.623631 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:03.623590 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-b8b7x" event={"ID":"06c79565-0f30-407d-99a1-82fd07c760f3","Type":"ContainerStarted","Data":"a1dbb8641b8980270ccfa13587549d7ff3cdbcf26c0762f347c7784120e7a4dd"}
Apr 21 15:47:03.624134 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:03.623641 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-b8b7x"
Apr 21 15:47:03.624860 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:03.624835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" event={"ID":"56a319d7-3883-4871-bc37-8a8b0772785c","Type":"ContainerStarted","Data":"1b1808b36d101e7222c748900fe184877fad0ac9376ea9125f64c8fed74a291c"}
Apr 21 15:47:03.624930 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:03.624917 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j"
Apr 21 15:47:03.640222 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:03.640179 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-b8b7x" podStartSLOduration=1.923109019 podStartE2EDuration="6.6401606s" podCreationTimestamp="2026-04-21 15:46:57 +0000 UTC" firstStartedPulling="2026-04-21 15:46:57.819266167 +0000 UTC m=+712.949319780" lastFinishedPulling="2026-04-21 15:47:02.536317732 +0000 UTC m=+717.666371361" observedRunningTime="2026-04-21 15:47:03.640147218 +0000 UTC m=+718.770200860" watchObservedRunningTime="2026-04-21 15:47:03.6401606 +0000 UTC m=+718.770214235"
Apr 21 15:47:03.656197 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:03.656098 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" podStartSLOduration=1.964033285 podStartE2EDuration="6.656072898s" podCreationTimestamp="2026-04-21 15:46:57 +0000 UTC" firstStartedPulling="2026-04-21 15:46:57.792708562 +0000 UTC m=+712.922762174" lastFinishedPulling="2026-04-21 15:47:02.484748161 +0000 UTC m=+717.614801787" observedRunningTime="2026-04-21 15:47:03.65496919 +0000 UTC m=+718.785022825" watchObservedRunningTime="2026-04-21 15:47:03.656072898 +0000 UTC m=+718.786126613"
Apr 21 15:47:09.630852 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:09.630817 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-b8b7x"
Apr 21 15:47:33.629944 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:33.629907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8"
Apr 21 15:47:34.630947 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:34.630915 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j"
Apr 21 15:47:36.002543 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.002506 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-45qb8"]
Apr 21 15:47:36.003026 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.002742 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" podUID="fd643b9b-bf44-4eee-85c4-78480002fb7a" containerName="manager" containerID="cri-o://802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5" gracePeriod=10
Apr 21 15:47:36.023896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.023872 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-zp5r9"]
Apr 21 15:47:36.026696 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.026670 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:36.037497 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.037469 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-zp5r9"]
Apr 21 15:47:36.086111 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.086079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqx95\" (UniqueName: \"kubernetes.io/projected/4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6-kube-api-access-qqx95\") pod \"kserve-controller-manager-9c85dd4d8-zp5r9\" (UID: \"4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6\") " pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:36.086239 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.086133 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6-cert\") pod \"kserve-controller-manager-9c85dd4d8-zp5r9\" (UID: \"4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6\") " pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:36.186472 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.186441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqx95\" (UniqueName: \"kubernetes.io/projected/4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6-kube-api-access-qqx95\") pod \"kserve-controller-manager-9c85dd4d8-zp5r9\" (UID: \"4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6\") " pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:36.186595 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.186485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6-cert\") pod \"kserve-controller-manager-9c85dd4d8-zp5r9\" (UID: \"4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6\") " pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:36.189013 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.188992 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6-cert\") pod \"kserve-controller-manager-9c85dd4d8-zp5r9\" (UID: \"4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6\") " pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:36.196174 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.196150 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqx95\" (UniqueName: \"kubernetes.io/projected/4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6-kube-api-access-qqx95\") pod \"kserve-controller-manager-9c85dd4d8-zp5r9\" (UID: \"4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6\") " pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:36.240509 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.240484 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8"
Apr 21 15:47:36.287583 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.287516 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd643b9b-bf44-4eee-85c4-78480002fb7a-cert\") pod \"fd643b9b-bf44-4eee-85c4-78480002fb7a\" (UID: \"fd643b9b-bf44-4eee-85c4-78480002fb7a\") "
Apr 21 15:47:36.287583 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.287549 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvcsn\" (UniqueName: \"kubernetes.io/projected/fd643b9b-bf44-4eee-85c4-78480002fb7a-kube-api-access-tvcsn\") pod \"fd643b9b-bf44-4eee-85c4-78480002fb7a\" (UID: \"fd643b9b-bf44-4eee-85c4-78480002fb7a\") "
Apr 21 15:47:36.289611 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.289578 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd643b9b-bf44-4eee-85c4-78480002fb7a-cert" (OuterVolumeSpecName: "cert") pod "fd643b9b-bf44-4eee-85c4-78480002fb7a" (UID: "fd643b9b-bf44-4eee-85c4-78480002fb7a"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:47:36.289709 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.289650 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd643b9b-bf44-4eee-85c4-78480002fb7a-kube-api-access-tvcsn" (OuterVolumeSpecName: "kube-api-access-tvcsn") pod "fd643b9b-bf44-4eee-85c4-78480002fb7a" (UID: "fd643b9b-bf44-4eee-85c4-78480002fb7a"). InnerVolumeSpecName "kube-api-access-tvcsn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:47:36.388252 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.388222 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd643b9b-bf44-4eee-85c4-78480002fb7a-cert\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:47:36.388252 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.388249 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvcsn\" (UniqueName: \"kubernetes.io/projected/fd643b9b-bf44-4eee-85c4-78480002fb7a-kube-api-access-tvcsn\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:47:36.393185 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.393163 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:36.510305 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.510282 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-zp5r9"]
Apr 21 15:47:36.512714 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:47:36.512684 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ad48c88_fd28_4a98_bf60_c7e0dd7a58f6.slice/crio-f6b082209a7726e2dc8b4df70af42293cc99c117d1613e4f19aa4774e6712e50 WatchSource:0}: Error finding container f6b082209a7726e2dc8b4df70af42293cc99c117d1613e4f19aa4774e6712e50: Status 404 returned error can't find the container with id f6b082209a7726e2dc8b4df70af42293cc99c117d1613e4f19aa4774e6712e50
Apr 21 15:47:36.513984 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.513967 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 15:47:36.730197 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.730158 2573 generic.go:358] "Generic (PLEG): container finished" podID="fd643b9b-bf44-4eee-85c4-78480002fb7a" containerID="802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5" exitCode=0
Apr 21 15:47:36.730399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.730204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" event={"ID":"fd643b9b-bf44-4eee-85c4-78480002fb7a","Type":"ContainerDied","Data":"802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5"}
Apr 21 15:47:36.730399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.730229 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8"
Apr 21 15:47:36.730399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.730245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-45qb8" event={"ID":"fd643b9b-bf44-4eee-85c4-78480002fb7a","Type":"ContainerDied","Data":"4e2c752ba93a80d62def12e698f3dcbeaca6771b8ed7bf1ce9fc948aa3a135b6"}
Apr 21 15:47:36.730399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.730267 2573 scope.go:117] "RemoveContainer" containerID="802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5"
Apr 21 15:47:36.731271 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.731238 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9" event={"ID":"4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6","Type":"ContainerStarted","Data":"f6b082209a7726e2dc8b4df70af42293cc99c117d1613e4f19aa4774e6712e50"}
Apr 21 15:47:36.737940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.737739 2573 scope.go:117] "RemoveContainer" containerID="802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5"
Apr 21 15:47:36.738274 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:47:36.738243 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5\": container with ID starting with 802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5 not found: ID does not exist" containerID="802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5"
Apr 21 15:47:36.738349 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.738285 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5"} err="failed to get container status \"802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5\": rpc error: code = NotFound desc = could not find container \"802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5\": container with ID starting with 802f9a3b553c2c285e0351cb290d73b5f5f19e782c8367134bab880fd599eaa5 not found: ID does not exist"
Apr 21 15:47:36.752655 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.752633 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-45qb8"]
Apr 21 15:47:36.754625 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:36.754605 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-9c85dd4d8-45qb8"]
Apr 21 15:47:37.480831 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:37.480778 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd643b9b-bf44-4eee-85c4-78480002fb7a" path="/var/lib/kubelet/pods/fd643b9b-bf44-4eee-85c4-78480002fb7a/volumes"
Apr 21 15:47:37.736223 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:37.736139 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9" event={"ID":"4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6","Type":"ContainerStarted","Data":"574f7174a455b86a4c317e5cc46feb7923d5a54a726f70c9532cf7584d45de20"}
Apr 21 15:47:37.736391 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:37.736252 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:47:37.753438 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:47:37.753390 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9" podStartSLOduration=1.438826177 podStartE2EDuration="1.753376378s" podCreationTimestamp="2026-04-21 15:47:36 +0000 UTC" firstStartedPulling="2026-04-21 15:47:36.514084285 +0000 UTC m=+751.644137897" lastFinishedPulling="2026-04-21 15:47:36.828634486 +0000 UTC m=+751.958688098" observedRunningTime="2026-04-21 15:47:37.752581017 +0000 UTC m=+752.882634652" watchObservedRunningTime="2026-04-21 15:47:37.753376378 +0000 UTC m=+752.883430012"
Apr 21 15:48:08.743711 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:08.743675 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-9c85dd4d8-zp5r9"
Apr 21 15:48:09.711874 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.711837 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-l4gvp"]
Apr 21 15:48:09.712175 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.712163 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd643b9b-bf44-4eee-85c4-78480002fb7a" containerName="manager"
Apr 21 15:48:09.712220 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.712177 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd643b9b-bf44-4eee-85c4-78480002fb7a" containerName="manager"
Apr 21 15:48:09.712253 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.712227 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd643b9b-bf44-4eee-85c4-78480002fb7a" containerName="manager"
Apr 21 15:48:09.716352 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.716327 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:09.717619 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.717595 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-lnjn8"]
Apr 21 15:48:09.719079 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.719055 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-qb2h6\""
Apr 21 15:48:09.719576 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.719559 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 21 15:48:09.720478 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.720462 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:09.723060 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.723038 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 21 15:48:09.723850 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.723765 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-r6b8f\""
Apr 21 15:48:09.729288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.729266 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-l4gvp"]
Apr 21 15:48:09.733448 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.733428 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-lnjn8"]
Apr 21 15:48:09.841750 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.841716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wrx\" (UniqueName: \"kubernetes.io/projected/2c857db3-273e-45ae-afd5-424141f11fbb-kube-api-access-p7wrx\") pod \"model-serving-api-86f7b4b499-l4gvp\" (UID: \"2c857db3-273e-45ae-afd5-424141f11fbb\") " pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:09.841750 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.841755 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c857db3-273e-45ae-afd5-424141f11fbb-tls-certs\") pod \"model-serving-api-86f7b4b499-l4gvp\" (UID: \"2c857db3-273e-45ae-afd5-424141f11fbb\") " pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:09.842184 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.841774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8b86f3e-2c7b-4474-bc10-18c0c1269089-cert\") pod \"odh-model-controller-696fc77849-lnjn8\" (UID: \"a8b86f3e-2c7b-4474-bc10-18c0c1269089\") " pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:09.842184 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.841831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l422q\" (UniqueName: \"kubernetes.io/projected/a8b86f3e-2c7b-4474-bc10-18c0c1269089-kube-api-access-l422q\") pod \"odh-model-controller-696fc77849-lnjn8\" (UID: \"a8b86f3e-2c7b-4474-bc10-18c0c1269089\") " pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:09.942991 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.942958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wrx\" (UniqueName: \"kubernetes.io/projected/2c857db3-273e-45ae-afd5-424141f11fbb-kube-api-access-p7wrx\") pod \"model-serving-api-86f7b4b499-l4gvp\" (UID: \"2c857db3-273e-45ae-afd5-424141f11fbb\") " pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:09.942991 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.942994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c857db3-273e-45ae-afd5-424141f11fbb-tls-certs\") pod \"model-serving-api-86f7b4b499-l4gvp\" (UID: \"2c857db3-273e-45ae-afd5-424141f11fbb\") " pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:09.943240 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.943011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8b86f3e-2c7b-4474-bc10-18c0c1269089-cert\") pod \"odh-model-controller-696fc77849-lnjn8\" (UID: \"a8b86f3e-2c7b-4474-bc10-18c0c1269089\") " pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:09.943240 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.943046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l422q\" (UniqueName: \"kubernetes.io/projected/a8b86f3e-2c7b-4474-bc10-18c0c1269089-kube-api-access-l422q\") pod \"odh-model-controller-696fc77849-lnjn8\" (UID: \"a8b86f3e-2c7b-4474-bc10-18c0c1269089\") " pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:09.943240 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:48:09.943158 2573 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 21 15:48:09.943240 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:48:09.943221 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8b86f3e-2c7b-4474-bc10-18c0c1269089-cert podName:a8b86f3e-2c7b-4474-bc10-18c0c1269089 nodeName:}" failed. No retries permitted until 2026-04-21 15:48:10.443204059 +0000 UTC m=+785.573257671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a8b86f3e-2c7b-4474-bc10-18c0c1269089-cert") pod "odh-model-controller-696fc77849-lnjn8" (UID: "a8b86f3e-2c7b-4474-bc10-18c0c1269089") : secret "odh-model-controller-webhook-cert" not found
Apr 21 15:48:09.945467 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.945442 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c857db3-273e-45ae-afd5-424141f11fbb-tls-certs\") pod \"model-serving-api-86f7b4b499-l4gvp\" (UID: \"2c857db3-273e-45ae-afd5-424141f11fbb\") " pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:09.954341 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.954317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wrx\" (UniqueName: \"kubernetes.io/projected/2c857db3-273e-45ae-afd5-424141f11fbb-kube-api-access-p7wrx\") pod \"model-serving-api-86f7b4b499-l4gvp\" (UID: \"2c857db3-273e-45ae-afd5-424141f11fbb\") " pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:09.954554 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:09.954538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l422q\" (UniqueName: \"kubernetes.io/projected/a8b86f3e-2c7b-4474-bc10-18c0c1269089-kube-api-access-l422q\") pod \"odh-model-controller-696fc77849-lnjn8\" (UID: \"a8b86f3e-2c7b-4474-bc10-18c0c1269089\") " pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:10.028691 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:10.028593 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:10.148974 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:10.148940 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-l4gvp"]
Apr 21 15:48:10.153190 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:48:10.153160 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c857db3_273e_45ae_afd5_424141f11fbb.slice/crio-315a2a7c7917ea661cb906051c5046e4faf5830638e12e3a5484bb16226c87b2 WatchSource:0}: Error finding container 315a2a7c7917ea661cb906051c5046e4faf5830638e12e3a5484bb16226c87b2: Status 404 returned error can't find the container with id 315a2a7c7917ea661cb906051c5046e4faf5830638e12e3a5484bb16226c87b2
Apr 21 15:48:10.447392 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:10.447357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8b86f3e-2c7b-4474-bc10-18c0c1269089-cert\") pod \"odh-model-controller-696fc77849-lnjn8\" (UID: \"a8b86f3e-2c7b-4474-bc10-18c0c1269089\") " pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:10.449763 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:10.449741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8b86f3e-2c7b-4474-bc10-18c0c1269089-cert\") pod \"odh-model-controller-696fc77849-lnjn8\" (UID: \"a8b86f3e-2c7b-4474-bc10-18c0c1269089\") " pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:10.636242 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:10.636206 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:10.758936 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:10.758908 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-lnjn8"]
Apr 21 15:48:10.761416 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:48:10.761386 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b86f3e_2c7b_4474_bc10_18c0c1269089.slice/crio-5fb29fe6a6bfeca2bc923592d86283b8ab28349729778855d938fd5df2ee3bb5 WatchSource:0}: Error finding container 5fb29fe6a6bfeca2bc923592d86283b8ab28349729778855d938fd5df2ee3bb5: Status 404 returned error can't find the container with id 5fb29fe6a6bfeca2bc923592d86283b8ab28349729778855d938fd5df2ee3bb5
Apr 21 15:48:10.838706 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:10.838648 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-lnjn8" event={"ID":"a8b86f3e-2c7b-4474-bc10-18c0c1269089","Type":"ContainerStarted","Data":"5fb29fe6a6bfeca2bc923592d86283b8ab28349729778855d938fd5df2ee3bb5"}
Apr 21 15:48:10.839991 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:10.839962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-l4gvp" event={"ID":"2c857db3-273e-45ae-afd5-424141f11fbb","Type":"ContainerStarted","Data":"315a2a7c7917ea661cb906051c5046e4faf5830638e12e3a5484bb16226c87b2"}
Apr 21 15:48:11.845399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:11.845361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-l4gvp" event={"ID":"2c857db3-273e-45ae-afd5-424141f11fbb","Type":"ContainerStarted","Data":"93b0d9a7d638ba28edacff74a3625d0b1c8dbc66d5d52130fa3aac5d63d8e7f0"}
Apr 21 15:48:11.845864 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:11.845436 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:11.862973 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:11.862911 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-l4gvp" podStartSLOduration=1.3875480470000001 podStartE2EDuration="2.862892781s" podCreationTimestamp="2026-04-21 15:48:09 +0000 UTC" firstStartedPulling="2026-04-21 15:48:10.155131972 +0000 UTC m=+785.285185584" lastFinishedPulling="2026-04-21 15:48:11.630476691 +0000 UTC m=+786.760530318" observedRunningTime="2026-04-21 15:48:11.86242598 +0000 UTC m=+786.992479629" watchObservedRunningTime="2026-04-21 15:48:11.862892781 +0000 UTC m=+786.992946418"
Apr 21 15:48:13.853665 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:13.853596 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-lnjn8" event={"ID":"a8b86f3e-2c7b-4474-bc10-18c0c1269089","Type":"ContainerStarted","Data":"ee7112c246bcc7b3b38e4f6598a7928dcad814ef41e73a6aa789996ec7c94e8a"}
Apr 21 15:48:13.853665 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:13.853672 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:13.872954 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:13.872904 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-lnjn8" podStartSLOduration=2.1260112270000002 podStartE2EDuration="4.872888394s" podCreationTimestamp="2026-04-21 15:48:09 +0000 UTC" firstStartedPulling="2026-04-21 15:48:10.762756153 +0000 UTC m=+785.892809766" lastFinishedPulling="2026-04-21 15:48:13.509633318 +0000 UTC m=+788.639686933" observedRunningTime="2026-04-21 15:48:13.871437669 +0000 UTC m=+789.001491303" watchObservedRunningTime="2026-04-21 15:48:13.872888394 +0000 UTC m=+789.002942028"
Apr 21 15:48:22.854288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:22.854252 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-l4gvp"
Apr 21 15:48:24.859337 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:24.859258 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-lnjn8"
Apr 21 15:48:25.698685 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:25.698651 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-hgbvx"]
Apr 21 15:48:25.702054 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:25.702038 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hgbvx"
Apr 21 15:48:25.708405 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:25.708375 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hgbvx"]
Apr 21 15:48:25.782286 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:25.782248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btbdv\" (UniqueName: \"kubernetes.io/projected/d697e35b-9a68-4f52-b436-5e4f4784ff7c-kube-api-access-btbdv\") pod \"s3-init-hgbvx\" (UID: \"d697e35b-9a68-4f52-b436-5e4f4784ff7c\") " pod="kserve/s3-init-hgbvx"
Apr 21 15:48:25.883260 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:25.883223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btbdv\" (UniqueName: \"kubernetes.io/projected/d697e35b-9a68-4f52-b436-5e4f4784ff7c-kube-api-access-btbdv\") pod \"s3-init-hgbvx\" (UID: \"d697e35b-9a68-4f52-b436-5e4f4784ff7c\") " pod="kserve/s3-init-hgbvx"
Apr 21 15:48:25.892664 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:25.892635 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btbdv\" (UniqueName: \"kubernetes.io/projected/d697e35b-9a68-4f52-b436-5e4f4784ff7c-kube-api-access-btbdv\") pod \"s3-init-hgbvx\" (UID: \"d697e35b-9a68-4f52-b436-5e4f4784ff7c\") " pod="kserve/s3-init-hgbvx"
Apr 21 15:48:26.012529 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:26.012440 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hgbvx"
Apr 21 15:48:26.136668 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:26.136643 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hgbvx"]
Apr 21 15:48:26.139274 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:48:26.139244 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd697e35b_9a68_4f52_b436_5e4f4784ff7c.slice/crio-aac4bb8735d184377deb975cb53c026f78c25e3b8a5ea773c3197a14f51f0ad5 WatchSource:0}: Error finding container aac4bb8735d184377deb975cb53c026f78c25e3b8a5ea773c3197a14f51f0ad5: Status 404 returned error can't find the container with id aac4bb8735d184377deb975cb53c026f78c25e3b8a5ea773c3197a14f51f0ad5
Apr 21 15:48:26.899293 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:26.899244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hgbvx" event={"ID":"d697e35b-9a68-4f52-b436-5e4f4784ff7c","Type":"ContainerStarted","Data":"aac4bb8735d184377deb975cb53c026f78c25e3b8a5ea773c3197a14f51f0ad5"}
Apr 21 15:48:30.914266 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:30.914182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hgbvx" event={"ID":"d697e35b-9a68-4f52-b436-5e4f4784ff7c","Type":"ContainerStarted","Data":"507da35e6bceec47b032afd3e94fb470b3683b395679caadd8fe74bb9b840a4f"}
Apr 21 15:48:30.930723 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:30.930614 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-hgbvx" podStartSLOduration=1.490384945 podStartE2EDuration="5.930598651s" podCreationTimestamp="2026-04-21 15:48:25 +0000 UTC" firstStartedPulling="2026-04-21 15:48:26.141045059 +0000 UTC m=+801.271098675" lastFinishedPulling="2026-04-21 15:48:30.58125875 +0000 UTC m=+805.711312381" observedRunningTime="2026-04-21 15:48:30.93015172 +0000 UTC m=+806.060205349" watchObservedRunningTime="2026-04-21 15:48:30.930598651 +0000 UTC m=+806.060652286"
Apr 21 15:48:33.924693 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:33.924607 2573 generic.go:358] "Generic (PLEG): container finished" podID="d697e35b-9a68-4f52-b436-5e4f4784ff7c" containerID="507da35e6bceec47b032afd3e94fb470b3683b395679caadd8fe74bb9b840a4f" exitCode=0
Apr 21 15:48:33.924693 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:33.924666 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hgbvx" event={"ID":"d697e35b-9a68-4f52-b436-5e4f4784ff7c","Type":"ContainerDied","Data":"507da35e6bceec47b032afd3e94fb470b3683b395679caadd8fe74bb9b840a4f"}
Apr 21 15:48:35.061762 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:35.061734 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hgbvx"
Apr 21 15:48:35.162578 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:35.162548 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btbdv\" (UniqueName: \"kubernetes.io/projected/d697e35b-9a68-4f52-b436-5e4f4784ff7c-kube-api-access-btbdv\") pod \"d697e35b-9a68-4f52-b436-5e4f4784ff7c\" (UID: \"d697e35b-9a68-4f52-b436-5e4f4784ff7c\") "
Apr 21 15:48:35.164530 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:35.164499 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d697e35b-9a68-4f52-b436-5e4f4784ff7c-kube-api-access-btbdv" (OuterVolumeSpecName: "kube-api-access-btbdv") pod "d697e35b-9a68-4f52-b436-5e4f4784ff7c" (UID: "d697e35b-9a68-4f52-b436-5e4f4784ff7c"). InnerVolumeSpecName "kube-api-access-btbdv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:48:35.263870 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:35.263754 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-btbdv\" (UniqueName: \"kubernetes.io/projected/d697e35b-9a68-4f52-b436-5e4f4784ff7c-kube-api-access-btbdv\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:48:35.932732 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:35.932696 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hgbvx" event={"ID":"d697e35b-9a68-4f52-b436-5e4f4784ff7c","Type":"ContainerDied","Data":"aac4bb8735d184377deb975cb53c026f78c25e3b8a5ea773c3197a14f51f0ad5"}
Apr 21 15:48:35.932732 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:35.932725 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hgbvx"
Apr 21 15:48:35.932997 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:35.932731 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aac4bb8735d184377deb975cb53c026f78c25e3b8a5ea773c3197a14f51f0ad5"
Apr 21 15:48:46.229480 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.229445 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c"]
Apr 21 15:48:46.229987 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.229967 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d697e35b-9a68-4f52-b436-5e4f4784ff7c" containerName="s3-init"
Apr 21 15:48:46.230053 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.229989 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d697e35b-9a68-4f52-b436-5e4f4784ff7c" containerName="s3-init"
Apr 21 15:48:46.230092 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.230052 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d697e35b-9a68-4f52-b436-5e4f4784ff7c" containerName="s3-init"
Apr 21
15:48:46.237032 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.237006 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.240114 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.239905 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 21 15:48:46.240114 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.239947 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 21 15:48:46.240308 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.240182 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 21 15:48:46.240308 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.240261 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-kwpld\"" Apr 21 15:48:46.244523 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.244491 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c"] Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253675 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253734 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" 
(UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253828 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4a4d299c-f46e-479b-aeee-2f438d299618-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4a4d299c-f46e-479b-aeee-2f438d299618-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/4a4d299c-f46e-479b-aeee-2f438d299618-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253917 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253960 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7mm\" (UniqueName: \"kubernetes.io/projected/4a4d299c-f46e-479b-aeee-2f438d299618-kube-api-access-8b7mm\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.254084 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.253993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355260 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/4a4d299c-f46e-479b-aeee-2f438d299618-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355269 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4a4d299c-f46e-479b-aeee-2f438d299618-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4a4d299c-f46e-479b-aeee-2f438d299618-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355354 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7mm\" (UniqueName: \"kubernetes.io/projected/4a4d299c-f46e-479b-aeee-2f438d299618-kube-api-access-8b7mm\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.355516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.356121 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.356121 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.355964 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.356121 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.356100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4a4d299c-f46e-479b-aeee-2f438d299618-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.356393 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.356368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.356607 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.356427 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.358387 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.358366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4a4d299c-f46e-479b-aeee-2f438d299618-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.358550 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.358533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4a4d299c-f46e-479b-aeee-2f438d299618-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.363297 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.363274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4a4d299c-f46e-479b-aeee-2f438d299618-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.363609 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.363594 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7mm\" (UniqueName: 
\"kubernetes.io/projected/4a4d299c-f46e-479b-aeee-2f438d299618-kube-api-access-8b7mm\") pod \"router-gateway-1-openshift-default-6c59fbf55c-pjq6c\" (UID: \"4a4d299c-f46e-479b-aeee-2f438d299618\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.554016 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.553917 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:46.698539 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.698505 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c"] Apr 21 15:48:46.703042 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:48:46.703012 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a4d299c_f46e_479b_aeee_2f438d299618.slice/crio-c239eb15c8772e465ee2ae0e6488d4d27bff36a6022aef806379d67460f1de82 WatchSource:0}: Error finding container c239eb15c8772e465ee2ae0e6488d4d27bff36a6022aef806379d67460f1de82: Status 404 returned error can't find the container with id c239eb15c8772e465ee2ae0e6488d4d27bff36a6022aef806379d67460f1de82 Apr 21 15:48:46.705411 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.705368 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:48:46.705535 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.705457 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:48:46.705535 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.705495 2573 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:48:46.977352 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.977313 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" event={"ID":"4a4d299c-f46e-479b-aeee-2f438d299618","Type":"ContainerStarted","Data":"85503004ec7f6d0fa70cf73ae6f8d942bc40d5bf29c6fac4c2e36d7dc80ef13c"} Apr 21 15:48:46.977352 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:46.977355 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" event={"ID":"4a4d299c-f46e-479b-aeee-2f438d299618","Type":"ContainerStarted","Data":"c239eb15c8772e465ee2ae0e6488d4d27bff36a6022aef806379d67460f1de82"} Apr 21 15:48:47.003180 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:47.003111 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" podStartSLOduration=1.003092065 podStartE2EDuration="1.003092065s" podCreationTimestamp="2026-04-21 15:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:48:46.999649339 +0000 UTC m=+822.129702975" watchObservedRunningTime="2026-04-21 15:48:47.003092065 +0000 UTC m=+822.133145700" Apr 21 15:48:47.555009 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:47.554969 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:47.560006 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:47.559977 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:47.981033 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:48:47.981000 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:47.982030 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:47.982012 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-pjq6c" Apr 21 15:48:51.474275 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.474240 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f"] Apr 21 15:48:51.477362 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.477334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.482317 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.482294 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lp6xp\"" Apr 21 15:48:51.482441 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.482324 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 21 15:48:51.490308 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.490286 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f"] Apr 21 15:48:51.593680 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.593644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4mfb\" (UniqueName: \"kubernetes.io/projected/8f5312cc-8836-479a-9c43-8b9708268f65-kube-api-access-n4mfb\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.593680 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.593682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-home\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.593948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.593711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.593948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.593813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-dshm\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.593948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.593850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5312cc-8836-479a-9c43-8b9708268f65-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.593948 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.593883 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-model-cache\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.694657 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.694621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-model-cache\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.694841 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.694683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4mfb\" (UniqueName: \"kubernetes.io/projected/8f5312cc-8836-479a-9c43-8b9708268f65-kube-api-access-n4mfb\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.694841 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.694717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-home\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.694841 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.694765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.694841 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.694807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-dshm\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.695055 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.694848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5312cc-8836-479a-9c43-8b9708268f65-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.695141 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.695121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-model-cache\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.695201 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.695128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-home\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: 
\"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.695235 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.695191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.696942 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.696912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-dshm\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.697205 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.697188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5312cc-8836-479a-9c43-8b9708268f65-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.707428 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.707400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4mfb\" (UniqueName: \"kubernetes.io/projected/8f5312cc-8836-479a-9c43-8b9708268f65-kube-api-access-n4mfb\") pod \"scheduler-configmap-ref-test-kserve-f649f474-wcs8f\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.789674 ip-10-0-136-123 kubenswrapper[2573]: 
I0421 15:48:51.789583 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:48:51.916273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.916089 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f"] Apr 21 15:48:51.918505 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:48:51.918475 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f5312cc_8836_479a_9c43_8b9708268f65.slice/crio-ae05d2d357f4eae67d3e00c7be677c386cea1b0f90063b671e4679b0a461707f WatchSource:0}: Error finding container ae05d2d357f4eae67d3e00c7be677c386cea1b0f90063b671e4679b0a461707f: Status 404 returned error can't find the container with id ae05d2d357f4eae67d3e00c7be677c386cea1b0f90063b671e4679b0a461707f Apr 21 15:48:51.994898 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:51.994868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" event={"ID":"8f5312cc-8836-479a-9c43-8b9708268f65","Type":"ContainerStarted","Data":"ae05d2d357f4eae67d3e00c7be677c386cea1b0f90063b671e4679b0a461707f"} Apr 21 15:48:57.025530 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.025494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" event={"ID":"8f5312cc-8836-479a-9c43-8b9708268f65","Type":"ContainerStarted","Data":"520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515"} Apr 21 15:48:57.068176 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.068140 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb"] Apr 21 15:48:57.072144 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.072121 2573 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.074672 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.074645 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 21 15:48:57.084396 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.084374 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb"] Apr 21 15:48:57.162316 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.162285 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.162493 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.162340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.162493 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.162382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2trll\" (UniqueName: \"kubernetes.io/projected/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kube-api-access-2trll\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: 
\"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.162493 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.162442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.162493 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.162483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.162691 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.162587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.263757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.263647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: 
\"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.263757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.263697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.263757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.263723 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2trll\" (UniqueName: \"kubernetes.io/projected/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kube-api-access-2trll\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.263757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.263755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.263757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.263819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.263757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.263881 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.264745 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.264237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.264745 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.264327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.264745 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.264415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.266718 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.266662 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.266718 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.266683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.272980 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.272950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2trll\" (UniqueName: \"kubernetes.io/projected/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kube-api-access-2trll\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.383948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.383911 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:48:57.548991 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:57.548958 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb"] Apr 21 15:48:58.032720 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:58.032672 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" event={"ID":"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925","Type":"ContainerStarted","Data":"67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1"} Apr 21 15:48:58.032720 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:48:58.032722 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" event={"ID":"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925","Type":"ContainerStarted","Data":"bec082787ed8b1181e7a9fc5187d403c52f09846d981728d6893811e25308fb2"} Apr 21 15:49:01.045074 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:01.044994 2573 generic.go:358] "Generic (PLEG): container finished" podID="8f5312cc-8836-479a-9c43-8b9708268f65" containerID="520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515" exitCode=0 Apr 21 15:49:01.045074 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:01.045062 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" event={"ID":"8f5312cc-8836-479a-9c43-8b9708268f65","Type":"ContainerDied","Data":"520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515"} Apr 21 15:49:03.053762 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:03.053726 2573 generic.go:358] "Generic (PLEG): container finished" podID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" containerID="67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1" 
exitCode=0 Apr 21 15:49:03.054095 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:03.053821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" event={"ID":"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925","Type":"ContainerDied","Data":"67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1"} Apr 21 15:49:04.058819 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:04.058757 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" event={"ID":"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925","Type":"ContainerStarted","Data":"dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566"} Apr 21 15:49:04.060302 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:04.060278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" event={"ID":"8f5312cc-8836-479a-9c43-8b9708268f65","Type":"ContainerStarted","Data":"8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7"} Apr 21 15:49:04.079362 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:04.079321 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" podStartSLOduration=7.079308996 podStartE2EDuration="7.079308996s" podCreationTimestamp="2026-04-21 15:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:49:04.077820799 +0000 UTC m=+839.207874430" watchObservedRunningTime="2026-04-21 15:49:04.079308996 +0000 UTC m=+839.209362630" Apr 21 15:49:04.098716 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:04.098676 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" 
podStartSLOduration=2.096760775 podStartE2EDuration="13.098664817s" podCreationTimestamp="2026-04-21 15:48:51 +0000 UTC" firstStartedPulling="2026-04-21 15:48:51.92044727 +0000 UTC m=+827.050500882" lastFinishedPulling="2026-04-21 15:49:02.922351293 +0000 UTC m=+838.052404924" observedRunningTime="2026-04-21 15:49:04.097850786 +0000 UTC m=+839.227904421" watchObservedRunningTime="2026-04-21 15:49:04.098664817 +0000 UTC m=+839.228718451" Apr 21 15:49:07.385004 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:07.384957 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:49:07.385466 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:07.385050 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:49:07.397456 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:07.397428 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:49:08.089006 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:08.088971 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:49:11.790691 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:11.790654 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:49:11.791105 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:11.790830 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:49:11.803303 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:11.803284 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:49:12.104169 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:12.104133 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:49:43.462461 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.462425 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f"] Apr 21 15:49:43.462951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.462743 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" podUID="8f5312cc-8836-479a-9c43-8b9708268f65" containerName="main" containerID="cri-o://8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7" gracePeriod=30 Apr 21 15:49:43.711243 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.711221 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:49:43.741848 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.741750 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4mfb\" (UniqueName: \"kubernetes.io/projected/8f5312cc-8836-479a-9c43-8b9708268f65-kube-api-access-n4mfb\") pod \"8f5312cc-8836-479a-9c43-8b9708268f65\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " Apr 21 15:49:43.741848 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.741844 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-dshm\") pod \"8f5312cc-8836-479a-9c43-8b9708268f65\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " Apr 21 15:49:43.742045 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.741954 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-home\") pod \"8f5312cc-8836-479a-9c43-8b9708268f65\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " Apr 21 15:49:43.742045 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.741995 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-kserve-provision-location\") pod \"8f5312cc-8836-479a-9c43-8b9708268f65\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " Apr 21 15:49:43.742152 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.742046 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-model-cache\") pod \"8f5312cc-8836-479a-9c43-8b9708268f65\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " Apr 21 15:49:43.742152 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:49:43.742095 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5312cc-8836-479a-9c43-8b9708268f65-tls-certs\") pod \"8f5312cc-8836-479a-9c43-8b9708268f65\" (UID: \"8f5312cc-8836-479a-9c43-8b9708268f65\") " Apr 21 15:49:43.742271 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.742197 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-home" (OuterVolumeSpecName: "home") pod "8f5312cc-8836-479a-9c43-8b9708268f65" (UID: "8f5312cc-8836-479a-9c43-8b9708268f65"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:49:43.742453 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.742421 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:49:43.742579 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.742533 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-model-cache" (OuterVolumeSpecName: "model-cache") pod "8f5312cc-8836-479a-9c43-8b9708268f65" (UID: "8f5312cc-8836-479a-9c43-8b9708268f65"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:49:43.744731 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.744667 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-dshm" (OuterVolumeSpecName: "dshm") pod "8f5312cc-8836-479a-9c43-8b9708268f65" (UID: "8f5312cc-8836-479a-9c43-8b9708268f65"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:49:43.744731 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.744667 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5312cc-8836-479a-9c43-8b9708268f65-kube-api-access-n4mfb" (OuterVolumeSpecName: "kube-api-access-n4mfb") pod "8f5312cc-8836-479a-9c43-8b9708268f65" (UID: "8f5312cc-8836-479a-9c43-8b9708268f65"). InnerVolumeSpecName "kube-api-access-n4mfb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:49:43.744950 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.744885 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5312cc-8836-479a-9c43-8b9708268f65-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8f5312cc-8836-479a-9c43-8b9708268f65" (UID: "8f5312cc-8836-479a-9c43-8b9708268f65"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:49:43.797900 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.797865 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f5312cc-8836-479a-9c43-8b9708268f65" (UID: "8f5312cc-8836-479a-9c43-8b9708268f65"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:49:43.843851 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.843823 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:49:43.843851 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.843848 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:49:43.843998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.843859 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5312cc-8836-479a-9c43-8b9708268f65-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:49:43.843998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.843868 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4mfb\" (UniqueName: \"kubernetes.io/projected/8f5312cc-8836-479a-9c43-8b9708268f65-kube-api-access-n4mfb\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:49:43.843998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:43.843878 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8f5312cc-8836-479a-9c43-8b9708268f65-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:49:44.199285 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.199251 2573 generic.go:358] "Generic (PLEG): container finished" podID="8f5312cc-8836-479a-9c43-8b9708268f65" containerID="8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7" exitCode=0 Apr 21 15:49:44.199435 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.199316 2573 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" Apr 21 15:49:44.199435 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.199337 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" event={"ID":"8f5312cc-8836-479a-9c43-8b9708268f65","Type":"ContainerDied","Data":"8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7"} Apr 21 15:49:44.199435 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.199375 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f" event={"ID":"8f5312cc-8836-479a-9c43-8b9708268f65","Type":"ContainerDied","Data":"ae05d2d357f4eae67d3e00c7be677c386cea1b0f90063b671e4679b0a461707f"} Apr 21 15:49:44.199435 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.199391 2573 scope.go:117] "RemoveContainer" containerID="8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7" Apr 21 15:49:44.208626 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.208610 2573 scope.go:117] "RemoveContainer" containerID="520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515" Apr 21 15:49:44.217596 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.217580 2573 scope.go:117] "RemoveContainer" containerID="8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7" Apr 21 15:49:44.217822 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:49:44.217788 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7\": container with ID starting with 8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7 not found: ID does not exist" containerID="8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7" Apr 21 15:49:44.217930 ip-10-0-136-123 kubenswrapper[2573]: I0421 
15:49:44.217829 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7"} err="failed to get container status \"8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7\": rpc error: code = NotFound desc = could not find container \"8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7\": container with ID starting with 8d66869124251fa3d520403b3d450e5bcbe2a31626df865cce2ee7bb40469ce7 not found: ID does not exist" Apr 21 15:49:44.217930 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.217850 2573 scope.go:117] "RemoveContainer" containerID="520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515" Apr 21 15:49:44.218048 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:49:44.218029 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515\": container with ID starting with 520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515 not found: ID does not exist" containerID="520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515" Apr 21 15:49:44.218083 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.218057 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515"} err="failed to get container status \"520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515\": rpc error: code = NotFound desc = could not find container \"520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515\": container with ID starting with 520a08a35a8c73ef8ea029da978bdb018593e99b9859eee789246e80e2317515 not found: ID does not exist" Apr 21 15:49:44.224983 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.224953 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f"] Apr 21 15:49:44.229532 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:44.229512 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f649f474-wcs8f"] Apr 21 15:49:45.481383 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:45.481352 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5312cc-8836-479a-9c43-8b9708268f65" path="/var/lib/kubelet/pods/8f5312cc-8836-479a-9c43-8b9708268f65/volumes" Apr 21 15:49:53.678853 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.678817 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"] Apr 21 15:49:53.679288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.679127 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f5312cc-8836-479a-9c43-8b9708268f65" containerName="main" Apr 21 15:49:53.679288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.679138 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5312cc-8836-479a-9c43-8b9708268f65" containerName="main" Apr 21 15:49:53.679288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.679162 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f5312cc-8836-479a-9c43-8b9708268f65" containerName="storage-initializer" Apr 21 15:49:53.679288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.679168 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5312cc-8836-479a-9c43-8b9708268f65" containerName="storage-initializer" Apr 21 15:49:53.679288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.679228 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f5312cc-8836-479a-9c43-8b9708268f65" containerName="main" Apr 21 15:49:53.683617 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.683599 2573 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.686173 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.686151 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 21 15:49:53.690551 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.690525 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"]
Apr 21 15:49:53.719666 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.719596 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.719666 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.719641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsx7g\" (UniqueName: \"kubernetes.io/projected/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kube-api-access-zsx7g\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.719865 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.719721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.719865 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.719819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-home\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.719964 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.719854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42f5c550-1b32-4dbe-b7f4-823d89e82b99-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.719964 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.719914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-dshm\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.820755 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.820722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.820890 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.820761 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsx7g\" (UniqueName: \"kubernetes.io/projected/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kube-api-access-zsx7g\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.820890 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.820843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.820890 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.820884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-home\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.821037 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.820908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42f5c550-1b32-4dbe-b7f4-823d89e82b99-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.821037 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.820936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-dshm\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.821199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.821178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.821262 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.821204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-home\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.821323 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.821261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-model-cache\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.823032 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.823012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-dshm\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.823339 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.823319 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42f5c550-1b32-4dbe-b7f4-823d89e82b99-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:53.831851 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:53.831826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsx7g\" (UniqueName: \"kubernetes.io/projected/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kube-api-access-zsx7g\") pod \"scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:54.014845 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.014739 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:49:54.119930 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.119894 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"]
Apr 21 15:49:54.125112 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.125087 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.127899 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.127861 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-wfr98\""
Apr 21 15:49:54.134907 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.134867 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"]
Apr 21 15:49:54.158177 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.158114 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"]
Apr 21 15:49:54.161648 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:49:54.161622 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f5c550_1b32_4dbe_b7f4_823d89e82b99.slice/crio-990328414b5f0b9f542a8940cedfd4ca7c2d448c99cc7e699d3c73342497e31c WatchSource:0}: Error finding container 990328414b5f0b9f542a8940cedfd4ca7c2d448c99cc7e699d3c73342497e31c: Status 404 returned error can't find the container with id 990328414b5f0b9f542a8940cedfd4ca7c2d448c99cc7e699d3c73342497e31c
Apr 21 15:49:54.225079 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.225046 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/401db9f3-89d8-4f6f-a516-424052654194-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.225191 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.225109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.225191 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.225153 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.225191 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.225175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.225361 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.225198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpmvs\" (UniqueName: \"kubernetes.io/projected/401db9f3-89d8-4f6f-a516-424052654194-kube-api-access-fpmvs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.225361 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.225243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.233893 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.233865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b" event={"ID":"42f5c550-1b32-4dbe-b7f4-823d89e82b99","Type":"ContainerStarted","Data":"7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c"}
Apr 21 15:49:54.234007 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.233898 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b" event={"ID":"42f5c550-1b32-4dbe-b7f4-823d89e82b99","Type":"ContainerStarted","Data":"990328414b5f0b9f542a8940cedfd4ca7c2d448c99cc7e699d3c73342497e31c"}
Apr 21 15:49:54.326593 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.326562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.326758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.326599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpmvs\" (UniqueName: \"kubernetes.io/projected/401db9f3-89d8-4f6f-a516-424052654194-kube-api-access-fpmvs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.326758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.326623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.326758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.326641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/401db9f3-89d8-4f6f-a516-424052654194-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.326758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.326707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.327022 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.326780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.327083 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.327040 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.327259 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.327196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.327259 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.327218 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.327259 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.327238 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.329085 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.329063 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/401db9f3-89d8-4f6f-a516-424052654194-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.337009 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.336987 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpmvs\" (UniqueName: \"kubernetes.io/projected/401db9f3-89d8-4f6f-a516-424052654194-kube-api-access-fpmvs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.437053 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.437011 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:49:54.585840 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:54.585791 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"]
Apr 21 15:49:55.239457 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:55.239415 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" event={"ID":"401db9f3-89d8-4f6f-a516-424052654194","Type":"ContainerStarted","Data":"33db7cb2b1ccd75aafcb8a9f04f352f9e3da79c08925e2250c4f183da14731ac"}
Apr 21 15:49:55.239457 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:55.239456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" event={"ID":"401db9f3-89d8-4f6f-a516-424052654194","Type":"ContainerStarted","Data":"1725f48a91ab37002ae8cfb844fab46f8d6ef3b4193918d9b6e9e6972e7ddd9f"}
Apr 21 15:49:56.244769 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:56.244733 2573 generic.go:358] "Generic (PLEG): container finished" podID="401db9f3-89d8-4f6f-a516-424052654194" containerID="33db7cb2b1ccd75aafcb8a9f04f352f9e3da79c08925e2250c4f183da14731ac" exitCode=0
Apr 21 15:49:56.245152 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:56.244825 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" event={"ID":"401db9f3-89d8-4f6f-a516-424052654194","Type":"ContainerDied","Data":"33db7cb2b1ccd75aafcb8a9f04f352f9e3da79c08925e2250c4f183da14731ac"}
Apr 21 15:49:57.347535 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:49:57.347506 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod401db9f3_89d8_4f6f_a516_424052654194.slice/crio-665699fbd5228b1c86a0c94b18bbf31f32c9c2ec848e9b702a29c836998ad6cf.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 15:49:58.255924 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:58.255880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" event={"ID":"401db9f3-89d8-4f6f-a516-424052654194","Type":"ContainerStarted","Data":"665699fbd5228b1c86a0c94b18bbf31f32c9c2ec848e9b702a29c836998ad6cf"}
Apr 21 15:49:59.272191 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:59.272144 2573 generic.go:358] "Generic (PLEG): container finished" podID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" containerID="7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c" exitCode=0
Apr 21 15:49:59.272701 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:49:59.272201 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b" event={"ID":"42f5c550-1b32-4dbe-b7f4-823d89e82b99","Type":"ContainerDied","Data":"7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c"}
Apr 21 15:50:00.278958 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:00.278921 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b" event={"ID":"42f5c550-1b32-4dbe-b7f4-823d89e82b99","Type":"ContainerStarted","Data":"7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71"}
Apr 21 15:50:00.305566 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:00.305509 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b" podStartSLOduration=7.305492615 podStartE2EDuration="7.305492615s" podCreationTimestamp="2026-04-21 15:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:50:00.298514909 +0000 UTC m=+895.428568543" watchObservedRunningTime="2026-04-21 15:50:00.305492615 +0000 UTC m=+895.435546263"
Apr 21 15:50:04.015530 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:04.015487 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:50:04.015972 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:04.015586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:50:04.031458 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:04.031426 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:50:04.314064 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:04.314029 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:50:05.434537 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:05.434509 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log"
Apr 21 15:50:05.435364 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:05.435341 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log"
Apr 21 15:50:27.390205 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:27.390168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" event={"ID":"401db9f3-89d8-4f6f-a516-424052654194","Type":"ContainerStarted","Data":"19e3e74ee093fe21c049427a94a40a5be8ad4c7a5f93f9308bbc5e565a0274d4"}
Apr 21 15:50:27.390656 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:27.390430 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:50:27.392904 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:27.392863 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 21 15:50:27.412014 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:27.411974 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podStartSLOduration=3.219204534 podStartE2EDuration="33.411958517s" podCreationTimestamp="2026-04-21 15:49:54 +0000 UTC" firstStartedPulling="2026-04-21 15:49:56.245946277 +0000 UTC m=+891.375999890" lastFinishedPulling="2026-04-21 15:50:26.438700259 +0000 UTC m=+921.568753873" observedRunningTime="2026-04-21 15:50:27.411147048 +0000 UTC m=+922.541200683" watchObservedRunningTime="2026-04-21 15:50:27.411958517 +0000 UTC m=+922.542012150"
Apr 21 15:50:28.395291 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:28.395252 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 21 15:50:28.801632 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:28.801534 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"]
Apr 21 15:50:28.801994 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:28.801936 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b" podUID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" containerName="main" containerID="cri-o://7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71" gracePeriod=30
Apr 21 15:50:28.808346 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:28.808311 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"]
Apr 21 15:50:29.064309 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.064285 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:50:29.149998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.149963 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42f5c550-1b32-4dbe-b7f4-823d89e82b99-tls-certs\") pod \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") "
Apr 21 15:50:29.150172 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.150009 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-dshm\") pod \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") "
Apr 21 15:50:29.150172 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.150042 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kserve-provision-location\") pod \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") "
Apr 21 15:50:29.150172 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.150112 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-home\") pod \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") "
Apr 21 15:50:29.150172 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.150138 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-model-cache\") pod \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") "
Apr 21 15:50:29.150172 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.150168 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsx7g\" (UniqueName: \"kubernetes.io/projected/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kube-api-access-zsx7g\") pod \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\" (UID: \"42f5c550-1b32-4dbe-b7f4-823d89e82b99\") "
Apr 21 15:50:29.150432 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.150394 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-home" (OuterVolumeSpecName: "home") pod "42f5c550-1b32-4dbe-b7f4-823d89e82b99" (UID: "42f5c550-1b32-4dbe-b7f4-823d89e82b99"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:50:29.150492 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.150432 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-model-cache" (OuterVolumeSpecName: "model-cache") pod "42f5c550-1b32-4dbe-b7f4-823d89e82b99" (UID: "42f5c550-1b32-4dbe-b7f4-823d89e82b99"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:50:29.152244 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.152214 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-dshm" (OuterVolumeSpecName: "dshm") pod "42f5c550-1b32-4dbe-b7f4-823d89e82b99" (UID: "42f5c550-1b32-4dbe-b7f4-823d89e82b99"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:50:29.152370 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.152333 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kube-api-access-zsx7g" (OuterVolumeSpecName: "kube-api-access-zsx7g") pod "42f5c550-1b32-4dbe-b7f4-823d89e82b99" (UID: "42f5c550-1b32-4dbe-b7f4-823d89e82b99"). InnerVolumeSpecName "kube-api-access-zsx7g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:50:29.152478 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.152456 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42f5c550-1b32-4dbe-b7f4-823d89e82b99-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "42f5c550-1b32-4dbe-b7f4-823d89e82b99" (UID: "42f5c550-1b32-4dbe-b7f4-823d89e82b99"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:50:29.205775 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.205733 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42f5c550-1b32-4dbe-b7f4-823d89e82b99" (UID: "42f5c550-1b32-4dbe-b7f4-823d89e82b99"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:50:29.250811 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.250773 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42f5c550-1b32-4dbe-b7f4-823d89e82b99-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:50:29.250963 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.250825 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:50:29.250963 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.250837 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:50:29.250963 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.250846 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:50:29.250963 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.250855 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/42f5c550-1b32-4dbe-b7f4-823d89e82b99-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:50:29.250963 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.250867 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsx7g\" (UniqueName: \"kubernetes.io/projected/42f5c550-1b32-4dbe-b7f4-823d89e82b99-kube-api-access-zsx7g\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:50:29.399031 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.398985 2573 generic.go:358] "Generic (PLEG): container finished" podID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" containerID="7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71" exitCode=0
Apr 21 15:50:29.399469 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.399073 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"
Apr 21 15:50:29.399469 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.399072 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b" event={"ID":"42f5c550-1b32-4dbe-b7f4-823d89e82b99","Type":"ContainerDied","Data":"7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71"}
Apr 21 15:50:29.399469 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.399118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b" event={"ID":"42f5c550-1b32-4dbe-b7f4-823d89e82b99","Type":"ContainerDied","Data":"990328414b5f0b9f542a8940cedfd4ca7c2d448c99cc7e699d3c73342497e31c"}
Apr 21 15:50:29.399469 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.399139 2573 scope.go:117] "RemoveContainer" containerID="7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71"
Apr 21 15:50:29.399683 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.399584 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main" containerID="cri-o://665699fbd5228b1c86a0c94b18bbf31f32c9c2ec848e9b702a29c836998ad6cf" gracePeriod=30
Apr 21 15:50:29.399683 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.399623 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="tokenizer" containerID="cri-o://19e3e74ee093fe21c049427a94a40a5be8ad4c7a5f93f9308bbc5e565a0274d4" gracePeriod=30
Apr 21 15:50:29.401461 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.401432 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 21 15:50:29.409985 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.409611 2573 scope.go:117] "RemoveContainer" containerID="7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c"
Apr 21 15:50:29.425327 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.425303 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"]
Apr 21 15:50:29.427920 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.427900 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-7dc6bb7d94-jcj2b"]
Apr 21 15:50:29.481303 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.481275 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" path="/var/lib/kubelet/pods/42f5c550-1b32-4dbe-b7f4-823d89e82b99/volumes"
Apr 21 15:50:29.564788 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.564763 2573 scope.go:117] "RemoveContainer" containerID="7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71"
Apr 21 15:50:29.565181 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:50:29.565151 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71\": container with ID starting with 7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71 not found: ID does not exist" containerID="7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71" Apr 21 15:50:29.565288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.565187 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71"} err="failed to get container status \"7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71\": rpc error: code = NotFound desc = could not find container \"7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71\": container with ID starting with 7764b5e6a06472ba581e5042b234e84184904d4743faad3273b4c8082dc13f71 not found: ID does not exist" Apr 21 15:50:29.565288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.565208 2573 scope.go:117] "RemoveContainer" containerID="7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c" Apr 21 15:50:29.565513 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:50:29.565499 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c\": container with ID starting with 7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c not found: ID does not exist" containerID="7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c" Apr 21 15:50:29.565556 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:29.565516 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c"} err="failed to get container status \"7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c\": rpc error: code = NotFound desc = could not find container 
\"7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c\": container with ID starting with 7e926786edb23bcc3999297574d0b920c74edff06fcf24640f09c01ec1c3332c not found: ID does not exist" Apr 21 15:50:30.404774 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:30.404740 2573 generic.go:358] "Generic (PLEG): container finished" podID="401db9f3-89d8-4f6f-a516-424052654194" containerID="665699fbd5228b1c86a0c94b18bbf31f32c9c2ec848e9b702a29c836998ad6cf" exitCode=0 Apr 21 15:50:30.404774 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:30.404786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" event={"ID":"401db9f3-89d8-4f6f-a516-424052654194","Type":"ContainerDied","Data":"665699fbd5228b1c86a0c94b18bbf31f32c9c2ec848e9b702a29c836998ad6cf"} Apr 21 15:50:34.438145 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:34.438108 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" Apr 21 15:50:39.400630 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:50:39.400595 2573 logging.go:55] [core] [Channel #26 SubChannel #27]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: connect: connection refused" Apr 21 15:50:40.401262 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:40.401217 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.33:9003\" within 1s: context deadline exceeded" Apr 21 15:50:41.289260 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.289225 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj"] Apr 21 15:50:41.289580 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.289567 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" containerName="storage-initializer" Apr 21 15:50:41.289629 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.289581 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" containerName="storage-initializer" Apr 21 15:50:41.289629 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.289599 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" containerName="main" Apr 21 15:50:41.289629 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.289604 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" containerName="main" Apr 21 15:50:41.289736 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.289656 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="42f5c550-1b32-4dbe-b7f4-823d89e82b99" containerName="main" Apr 21 15:50:41.918219 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.918181 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:41.921306 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.921283 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 21 15:50:41.923941 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.923915 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj"] Apr 21 15:50:41.961441 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.961412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-model-cache\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:41.961562 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.961456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-dshm\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:41.961562 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.961484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 
15:50:41.961671 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.961580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/723467bf-4ad7-47ef-8561-95afa5e563ed-tls-certs\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:41.961671 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.961632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-home\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:41.961742 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:41.961685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pvts\" (UniqueName: \"kubernetes.io/projected/723467bf-4ad7-47ef-8561-95afa5e563ed-kube-api-access-8pvts\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.062895 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.062862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/723467bf-4ad7-47ef-8561-95afa5e563ed-tls-certs\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.063057 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.062908 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-home\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.063057 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.062938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pvts\" (UniqueName: \"kubernetes.io/projected/723467bf-4ad7-47ef-8561-95afa5e563ed-kube-api-access-8pvts\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.063057 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.062971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-model-cache\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.063057 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.062997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-dshm\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.063057 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.063030 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-kserve-provision-location\") pod 
\"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.063457 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.063424 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-model-cache\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.063584 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.063498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-home\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.063584 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.063528 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.065217 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.065188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-dshm\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.065461 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:50:42.065446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/723467bf-4ad7-47ef-8561-95afa5e563ed-tls-certs\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.071071 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.071045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pvts\" (UniqueName: \"kubernetes.io/projected/723467bf-4ad7-47ef-8561-95afa5e563ed-kube-api-access-8pvts\") pod \"precise-prefix-cache-test-kserve-7b7596699d-pj7cj\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.230101 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.230019 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:42.349779 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.349749 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj"] Apr 21 15:50:42.352437 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:50:42.352402 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod723467bf_4ad7_47ef_8561_95afa5e563ed.slice/crio-af8c7c8022e3e291616f43198e97cd75bd3ccf0ebe5c62f78213e91532901d1d WatchSource:0}: Error finding container af8c7c8022e3e291616f43198e97cd75bd3ccf0ebe5c62f78213e91532901d1d: Status 404 returned error can't find the container with id af8c7c8022e3e291616f43198e97cd75bd3ccf0ebe5c62f78213e91532901d1d Apr 21 15:50:42.447783 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:42.447748 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" event={"ID":"723467bf-4ad7-47ef-8561-95afa5e563ed","Type":"ContainerStarted","Data":"af8c7c8022e3e291616f43198e97cd75bd3ccf0ebe5c62f78213e91532901d1d"} Apr 21 15:50:43.452023 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:43.451988 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" event={"ID":"723467bf-4ad7-47ef-8561-95afa5e563ed","Type":"ContainerStarted","Data":"5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840"} Apr 21 15:50:47.468734 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:47.468698 2573 generic.go:358] "Generic (PLEG): container finished" podID="723467bf-4ad7-47ef-8561-95afa5e563ed" containerID="5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840" exitCode=0 Apr 21 15:50:47.469163 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:47.468773 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" event={"ID":"723467bf-4ad7-47ef-8561-95afa5e563ed","Type":"ContainerDied","Data":"5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840"} Apr 21 15:50:48.473758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:48.473722 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" event={"ID":"723467bf-4ad7-47ef-8561-95afa5e563ed","Type":"ContainerStarted","Data":"5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4"} Apr 21 15:50:48.495156 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:48.495106 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" podStartSLOduration=7.495089227 podStartE2EDuration="7.495089227s" podCreationTimestamp="2026-04-21 15:50:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:50:48.493776689 +0000 UTC m=+943.623830326" watchObservedRunningTime="2026-04-21 15:50:48.495089227 +0000 UTC m=+943.625142862" Apr 21 15:50:49.400940 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:50:49.400909 2573 logging.go:55] [core] [Channel #28 SubChannel #29]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: connect: connection refused" Apr 21 15:50:50.401129 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:50.401083 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.33:9003\" within 1s: context deadline exceeded" Apr 21 15:50:52.230160 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:52.230122 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:52.230160 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:52.230168 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:52.242656 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:52.242625 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:52.497067 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:52.496998 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" Apr 21 15:50:59.400414 ip-10-0-136-123 kubenswrapper[2573]: W0421 
15:50:59.400382 2573 logging.go:55] [core] [Channel #30 SubChannel #31]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: connect: connection refused" Apr 21 15:50:59.508746 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:59.508721 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd_401db9f3-89d8-4f6f-a516-424052654194/tokenizer/0.log" Apr 21 15:50:59.509327 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:59.509301 2573 generic.go:358] "Generic (PLEG): container finished" podID="401db9f3-89d8-4f6f-a516-424052654194" containerID="19e3e74ee093fe21c049427a94a40a5be8ad4c7a5f93f9308bbc5e565a0274d4" exitCode=137 Apr 21 15:50:59.509408 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:50:59.509384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" event={"ID":"401db9f3-89d8-4f6f-a516-424052654194","Type":"ContainerDied","Data":"19e3e74ee093fe21c049427a94a40a5be8ad4c7a5f93f9308bbc5e565a0274d4"} Apr 21 15:51:00.062948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.062924 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd_401db9f3-89d8-4f6f-a516-424052654194/tokenizer/0.log" Apr 21 15:51:00.063612 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.063592 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" Apr 21 15:51:00.220161 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220087 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-tmp\") pod \"401db9f3-89d8-4f6f-a516-424052654194\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " Apr 21 15:51:00.220161 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220133 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-cache\") pod \"401db9f3-89d8-4f6f-a516-424052654194\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " Apr 21 15:51:00.220161 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220160 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-kserve-provision-location\") pod \"401db9f3-89d8-4f6f-a516-424052654194\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " Apr 21 15:51:00.220385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220178 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpmvs\" (UniqueName: \"kubernetes.io/projected/401db9f3-89d8-4f6f-a516-424052654194-kube-api-access-fpmvs\") pod \"401db9f3-89d8-4f6f-a516-424052654194\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " Apr 21 15:51:00.220385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220225 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/401db9f3-89d8-4f6f-a516-424052654194-tls-certs\") pod \"401db9f3-89d8-4f6f-a516-424052654194\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") 
" Apr 21 15:51:00.220385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220261 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-uds\") pod \"401db9f3-89d8-4f6f-a516-424052654194\" (UID: \"401db9f3-89d8-4f6f-a516-424052654194\") " Apr 21 15:51:00.220535 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220401 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "401db9f3-89d8-4f6f-a516-424052654194" (UID: "401db9f3-89d8-4f6f-a516-424052654194"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:00.220535 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220481 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:51:00.220535 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220500 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "401db9f3-89d8-4f6f-a516-424052654194" (UID: "401db9f3-89d8-4f6f-a516-424052654194"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:51:00.220688 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220668 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "401db9f3-89d8-4f6f-a516-424052654194" (UID: "401db9f3-89d8-4f6f-a516-424052654194"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:00.220917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.220896 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "401db9f3-89d8-4f6f-a516-424052654194" (UID: "401db9f3-89d8-4f6f-a516-424052654194"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:00.222359 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.222338 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401db9f3-89d8-4f6f-a516-424052654194-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "401db9f3-89d8-4f6f-a516-424052654194" (UID: "401db9f3-89d8-4f6f-a516-424052654194"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:51:00.222405 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.222373 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401db9f3-89d8-4f6f-a516-424052654194-kube-api-access-fpmvs" (OuterVolumeSpecName: "kube-api-access-fpmvs") pod "401db9f3-89d8-4f6f-a516-424052654194" (UID: "401db9f3-89d8-4f6f-a516-424052654194"). InnerVolumeSpecName "kube-api-access-fpmvs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:51:00.321536 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.321499 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/401db9f3-89d8-4f6f-a516-424052654194-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:00.321536 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.321531 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-uds\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:00.321536 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.321540 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-tokenizer-tmp\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:00.321762 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.321551 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/401db9f3-89d8-4f6f-a516-424052654194-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:00.321762 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.321561 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fpmvs\" (UniqueName: \"kubernetes.io/projected/401db9f3-89d8-4f6f-a516-424052654194-kube-api-access-fpmvs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:00.400474 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.400429 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.33:9003\" within 1s: context deadline exceeded"
Apr 21 15:51:00.515588 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.515517 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd_401db9f3-89d8-4f6f-a516-424052654194/tokenizer/0.log"
Apr 21 15:51:00.516228 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.516192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd" event={"ID":"401db9f3-89d8-4f6f-a516-424052654194","Type":"ContainerDied","Data":"1725f48a91ab37002ae8cfb844fab46f8d6ef3b4193918d9b6e9e6972e7ddd9f"}
Apr 21 15:51:00.516348 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.516235 2573 scope.go:117] "RemoveContainer" containerID="19e3e74ee093fe21c049427a94a40a5be8ad4c7a5f93f9308bbc5e565a0274d4"
Apr 21 15:51:00.516348 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.516240 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"
Apr 21 15:51:00.524477 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.524420 2573 scope.go:117] "RemoveContainer" containerID="665699fbd5228b1c86a0c94b18bbf31f32c9c2ec848e9b702a29c836998ad6cf"
Apr 21 15:51:00.531858 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.531841 2573 scope.go:117] "RemoveContainer" containerID="33db7cb2b1ccd75aafcb8a9f04f352f9e3da79c08925e2250c4f183da14731ac"
Apr 21 15:51:00.539174 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.539143 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"]
Apr 21 15:51:00.542461 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:00.542441 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7458cc8jzbdd"]
Apr 21 15:51:01.482156 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:01.482120 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401db9f3-89d8-4f6f-a516-424052654194" path="/var/lib/kubelet/pods/401db9f3-89d8-4f6f-a516-424052654194/volumes"
Apr 21 15:51:14.499391 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.499352 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj"]
Apr 21 15:51:14.501769 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.499627 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" podUID="723467bf-4ad7-47ef-8561-95afa5e563ed" containerName="main" containerID="cri-o://5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4" gracePeriod=30
Apr 21 15:51:14.752277 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.752212 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj"
Apr 21 15:51:14.950226 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.950190 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pvts\" (UniqueName: \"kubernetes.io/projected/723467bf-4ad7-47ef-8561-95afa5e563ed-kube-api-access-8pvts\") pod \"723467bf-4ad7-47ef-8561-95afa5e563ed\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") "
Apr 21 15:51:14.950226 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.950232 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-kserve-provision-location\") pod \"723467bf-4ad7-47ef-8561-95afa5e563ed\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") "
Apr 21 15:51:14.950466 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.950259 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-dshm\") pod \"723467bf-4ad7-47ef-8561-95afa5e563ed\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") "
Apr 21 15:51:14.950466 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.950292 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-model-cache\") pod \"723467bf-4ad7-47ef-8561-95afa5e563ed\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") "
Apr 21 15:51:14.950466 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.950340 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/723467bf-4ad7-47ef-8561-95afa5e563ed-tls-certs\") pod \"723467bf-4ad7-47ef-8561-95afa5e563ed\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") "
Apr 21 15:51:14.950466 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.950378 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-home\") pod \"723467bf-4ad7-47ef-8561-95afa5e563ed\" (UID: \"723467bf-4ad7-47ef-8561-95afa5e563ed\") "
Apr 21 15:51:14.950678 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.950588 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-model-cache" (OuterVolumeSpecName: "model-cache") pod "723467bf-4ad7-47ef-8561-95afa5e563ed" (UID: "723467bf-4ad7-47ef-8561-95afa5e563ed"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:14.950725 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.950671 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-home" (OuterVolumeSpecName: "home") pod "723467bf-4ad7-47ef-8561-95afa5e563ed" (UID: "723467bf-4ad7-47ef-8561-95afa5e563ed"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:14.952517 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.952481 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-dshm" (OuterVolumeSpecName: "dshm") pod "723467bf-4ad7-47ef-8561-95afa5e563ed" (UID: "723467bf-4ad7-47ef-8561-95afa5e563ed"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:14.952517 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.952481 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/723467bf-4ad7-47ef-8561-95afa5e563ed-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "723467bf-4ad7-47ef-8561-95afa5e563ed" (UID: "723467bf-4ad7-47ef-8561-95afa5e563ed"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:51:14.952517 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:14.952492 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723467bf-4ad7-47ef-8561-95afa5e563ed-kube-api-access-8pvts" (OuterVolumeSpecName: "kube-api-access-8pvts") pod "723467bf-4ad7-47ef-8561-95afa5e563ed" (UID: "723467bf-4ad7-47ef-8561-95afa5e563ed"). InnerVolumeSpecName "kube-api-access-8pvts". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:51:15.004625 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.004547 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "723467bf-4ad7-47ef-8561-95afa5e563ed" (UID: "723467bf-4ad7-47ef-8561-95afa5e563ed"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:15.051318 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.051285 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pvts\" (UniqueName: \"kubernetes.io/projected/723467bf-4ad7-47ef-8561-95afa5e563ed-kube-api-access-8pvts\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:15.051318 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.051312 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:15.051318 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.051324 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:15.051533 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.051333 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:15.051533 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.051343 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/723467bf-4ad7-47ef-8561-95afa5e563ed-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:15.051533 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.051350 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/723467bf-4ad7-47ef-8561-95afa5e563ed-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:15.573288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.573248 2573 generic.go:358] "Generic (PLEG): container finished" podID="723467bf-4ad7-47ef-8561-95afa5e563ed" containerID="5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4" exitCode=0
Apr 21 15:51:15.573738 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.573369 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj"
Apr 21 15:51:15.573738 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.573364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" event={"ID":"723467bf-4ad7-47ef-8561-95afa5e563ed","Type":"ContainerDied","Data":"5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4"}
Apr 21 15:51:15.573738 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.573491 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj" event={"ID":"723467bf-4ad7-47ef-8561-95afa5e563ed","Type":"ContainerDied","Data":"af8c7c8022e3e291616f43198e97cd75bd3ccf0ebe5c62f78213e91532901d1d"}
Apr 21 15:51:15.573738 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.573516 2573 scope.go:117] "RemoveContainer" containerID="5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4"
Apr 21 15:51:15.582585 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.582546 2573 scope.go:117] "RemoveContainer" containerID="5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840"
Apr 21 15:51:15.594281 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.594258 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj"]
Apr 21 15:51:15.599839 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.599814 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7b7596699d-pj7cj"]
Apr 21 15:51:15.643940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.643920 2573 scope.go:117] "RemoveContainer" containerID="5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4"
Apr 21 15:51:15.644243 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:51:15.644222 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4\": container with ID starting with 5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4 not found: ID does not exist" containerID="5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4"
Apr 21 15:51:15.644292 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.644253 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4"} err="failed to get container status \"5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4\": rpc error: code = NotFound desc = could not find container \"5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4\": container with ID starting with 5fbfd9e061fa6aad8a5d671b329c9abc806a68a8348cd5f8292692bb6e3ed7a4 not found: ID does not exist"
Apr 21 15:51:15.644292 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.644275 2573 scope.go:117] "RemoveContainer" containerID="5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840"
Apr 21 15:51:15.644534 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:51:15.644519 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840\": container with ID starting with 5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840 not found: ID does not exist" containerID="5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840"
Apr 21 15:51:15.644578 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:15.644538 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840"} err="failed to get container status \"5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840\": rpc error: code = NotFound desc = could not find container \"5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840\": container with ID starting with 5b14827dc1ff3456ba10a48837fb189721386fa5aee5921eccddf2a6f3498840 not found: ID does not exist"
Apr 21 15:51:17.481890 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:17.481854 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723467bf-4ad7-47ef-8561-95afa5e563ed" path="/var/lib/kubelet/pods/723467bf-4ad7-47ef-8561-95afa5e563ed/volumes"
Apr 21 15:51:25.971546 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971468 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"]
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971832 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="storage-initializer"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971845 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="storage-initializer"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971862 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="723467bf-4ad7-47ef-8561-95afa5e563ed" containerName="main"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971867 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="723467bf-4ad7-47ef-8561-95afa5e563ed" containerName="main"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971881 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="723467bf-4ad7-47ef-8561-95afa5e563ed" containerName="storage-initializer"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971886 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="723467bf-4ad7-47ef-8561-95afa5e563ed" containerName="storage-initializer"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971894 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971898 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971905 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="tokenizer"
Apr 21 15:51:25.971917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971910 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="tokenizer"
Apr 21 15:51:25.972214 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971955 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="main"
Apr 21 15:51:25.972214 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971964 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="723467bf-4ad7-47ef-8561-95afa5e563ed" containerName="main"
Apr 21 15:51:25.972214 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.971973 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="401db9f3-89d8-4f6f-a516-424052654194" containerName="tokenizer"
Apr 21 15:51:25.977180 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.977159 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:25.979996 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.979975 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 21 15:51:25.985324 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:25.985301 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"]
Apr 21 15:51:26.045138 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.045104 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab1c193-73c8-471b-96d9-c21e1942317e-tls-certs\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.045304 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.045161 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-model-cache\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.045304 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.045207 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-dshm\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.045304 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.045246 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-home\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.045417 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.045309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-kserve-provision-location\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.045417 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.045340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmcrg\" (UniqueName: \"kubernetes.io/projected/5ab1c193-73c8-471b-96d9-c21e1942317e-kube-api-access-fmcrg\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.146783 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.146749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-dshm\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.146998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.146814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-home\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.146998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.146862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-kserve-provision-location\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.146998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.146887 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmcrg\" (UniqueName: \"kubernetes.io/projected/5ab1c193-73c8-471b-96d9-c21e1942317e-kube-api-access-fmcrg\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.146998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.146930 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab1c193-73c8-471b-96d9-c21e1942317e-tls-certs\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.146998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.146974 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-model-cache\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.147357 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.147337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-kserve-provision-location\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.147425 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.147371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-model-cache\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.147425 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.147371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-home\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.149044 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.149020 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-dshm\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.149317 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.149298 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab1c193-73c8-471b-96d9-c21e1942317e-tls-certs\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.155448 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.155409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmcrg\" (UniqueName: \"kubernetes.io/projected/5ab1c193-73c8-471b-96d9-c21e1942317e-kube-api-access-fmcrg\") pod \"stop-feature-test-kserve-7b655f99d9-4s9g6\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.289005 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.288921 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:51:26.420839 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.420810 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"]
Apr 21 15:51:26.423127 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:51:26.423093 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab1c193_73c8_471b_96d9_c21e1942317e.slice/crio-44998776a7bef74bccec014766c6e027b1cb3a0a1a7adaf800d3ffbe8a19233b WatchSource:0}: Error finding container 44998776a7bef74bccec014766c6e027b1cb3a0a1a7adaf800d3ffbe8a19233b: Status 404 returned error can't find the container with id 44998776a7bef74bccec014766c6e027b1cb3a0a1a7adaf800d3ffbe8a19233b
Apr 21 15:51:26.618701 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.618658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" event={"ID":"5ab1c193-73c8-471b-96d9-c21e1942317e","Type":"ContainerStarted","Data":"df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e"}
Apr 21 15:51:26.618877 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:26.618706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" event={"ID":"5ab1c193-73c8-471b-96d9-c21e1942317e","Type":"ContainerStarted","Data":"44998776a7bef74bccec014766c6e027b1cb3a0a1a7adaf800d3ffbe8a19233b"}
Apr 21 15:51:28.144914 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.144879 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb"]
Apr 21 15:51:28.145374 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.145256 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" podUID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" containerName="main" containerID="cri-o://dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566" gracePeriod=30
Apr 21 15:51:28.405520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.405456 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb"
Apr 21 15:51:28.468679 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.468643 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2trll\" (UniqueName: \"kubernetes.io/projected/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kube-api-access-2trll\") pod \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") "
Apr 21 15:51:28.468884 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.468692 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kserve-provision-location\") pod \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") "
Apr 21 15:51:28.468884 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.468719 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-model-cache\") pod \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") "
Apr 21 15:51:28.468884 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.468756 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-tls-certs\") pod \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") "
Apr 21 15:51:28.468884 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.468780 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-dshm\") pod \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") "
Apr 21 15:51:28.468884 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.468824 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-home\") pod \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\" (UID: \"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925\") "
Apr 21 15:51:28.469157 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.469118 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-home" (OuterVolumeSpecName: "home") pod "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" (UID: "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:28.469157 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.469115 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-model-cache" (OuterVolumeSpecName: "model-cache") pod "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" (UID: "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:28.470980 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.470948 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" (UID: "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:51:28.470980 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.470967 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kube-api-access-2trll" (OuterVolumeSpecName: "kube-api-access-2trll") pod "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" (UID: "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925"). InnerVolumeSpecName "kube-api-access-2trll". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:51:28.471175 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.470997 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-dshm" (OuterVolumeSpecName: "dshm") pod "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" (UID: "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:28.524147 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.524097 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" (UID: "dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:51:28.570140 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.570106 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:28.570140 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.570135 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2trll\" (UniqueName: \"kubernetes.io/projected/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kube-api-access-2trll\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:28.570140 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.570145 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:28.570405 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.570156 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:28.570405 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.570166 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:28.570405 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.570174 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:51:28.629078 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.629031 2573
generic.go:358] "Generic (PLEG): container finished" podID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" containerID="dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566" exitCode=0 Apr 21 15:51:28.629078 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.629070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" event={"ID":"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925","Type":"ContainerDied","Data":"dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566"} Apr 21 15:51:28.629288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.629108 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" event={"ID":"dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925","Type":"ContainerDied","Data":"bec082787ed8b1181e7a9fc5187d403c52f09846d981728d6893811e25308fb2"} Apr 21 15:51:28.629288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.629125 2573 scope.go:117] "RemoveContainer" containerID="dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566" Apr 21 15:51:28.629288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.629142 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb" Apr 21 15:51:28.637742 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.637720 2573 scope.go:117] "RemoveContainer" containerID="67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1" Apr 21 15:51:28.654493 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.654462 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb"] Apr 21 15:51:28.660424 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.656735 2573 scope.go:117] "RemoveContainer" containerID="dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566" Apr 21 15:51:28.660424 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:51:28.657381 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566\": container with ID starting with dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566 not found: ID does not exist" containerID="dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566" Apr 21 15:51:28.660424 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.657413 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566"} err="failed to get container status \"dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566\": rpc error: code = NotFound desc = could not find container \"dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566\": container with ID starting with dc7be136e2fb715f602ad3a17f5c15176f7252bae3fb8ceb51a90b577d078566 not found: ID does not exist" Apr 21 15:51:28.660424 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.657441 2573 scope.go:117] "RemoveContainer" 
containerID="67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1" Apr 21 15:51:28.660424 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:51:28.657791 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1\": container with ID starting with 67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1 not found: ID does not exist" containerID="67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1" Apr 21 15:51:28.660424 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.657838 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1"} err="failed to get container status \"67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1\": rpc error: code = NotFound desc = could not find container \"67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1\": container with ID starting with 67304691b01969687ff2abae07fcbc0f9a5f5065e72f9b7dc6c3a52ce2db71f1 not found: ID does not exist" Apr 21 15:51:28.660424 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:28.660344 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-84bb9f8458kjmrb"] Apr 21 15:51:29.481207 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:29.481172 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" path="/var/lib/kubelet/pods/dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925/volumes" Apr 21 15:51:31.640522 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:31.640485 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerID="df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e" exitCode=0 Apr 21 15:51:31.640891 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:51:31.640553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" event={"ID":"5ab1c193-73c8-471b-96d9-c21e1942317e","Type":"ContainerDied","Data":"df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e"} Apr 21 15:51:50.587152 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.587120 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"] Apr 21 15:51:50.587611 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.587452 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" containerName="main" Apr 21 15:51:50.587611 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.587463 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" containerName="main" Apr 21 15:51:50.587611 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.587480 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" containerName="storage-initializer" Apr 21 15:51:50.587611 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.587486 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" containerName="storage-initializer" Apr 21 15:51:50.587611 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.587550 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd5ff7b2-09a9-4ff5-8da2-69a2d3a5a925" containerName="main" Apr 21 15:51:50.643731 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.643700 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"] Apr 21 15:51:50.643913 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.643860 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.646521 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.646498 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 21 15:51:50.768758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.768723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.768758 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.768758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.769005 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.768782 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.769005 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.768889 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b87q\" (UniqueName: \"kubernetes.io/projected/48562357-724a-429c-b5df-83c134ccdde5-kube-api-access-7b87q\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.769005 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.768985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.769005 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.769005 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48562357-724a-429c-b5df-83c134ccdde5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.842723 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.842631 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"] Apr 21 15:51:50.868120 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.868091 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"] Apr 21 15:51:50.868298 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.868217 2573 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:50.869542 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.869515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.869671 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.869549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48562357-724a-429c-b5df-83c134ccdde5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.869739 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.869676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.869739 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.869717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.869944 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.869760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.869944 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.869827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b87q\" (UniqueName: \"kubernetes.io/projected/48562357-724a-429c-b5df-83c134ccdde5-kube-api-access-7b87q\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.869944 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.869938 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.870090 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.870044 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.870189 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.870169 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.870905 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.870876 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-sslvj\"" Apr 21 15:51:50.872215 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.872193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.872320 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.872261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48562357-724a-429c-b5df-83c134ccdde5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.882123 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.882098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b87q\" (UniqueName: 
\"kubernetes.io/projected/48562357-724a-429c-b5df-83c134ccdde5-kube-api-access-7b87q\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.954258 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.954221 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:51:50.970501 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.970467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:50.970622 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.970545 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:50.970622 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.970576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:50.970742 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.970631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:50.970742 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.970682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:50.970742 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:50.970722 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4gjr\" (UniqueName: \"kubernetes.io/projected/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kube-api-access-m4gjr\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.072916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: 
\"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.072976 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.073045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.073108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.073152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4gjr\" (UniqueName: \"kubernetes.io/projected/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kube-api-access-m4gjr\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: 
\"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.073180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.073677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.073957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.074178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.075896 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.074574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.085137 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.085066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4gjr\" (UniqueName: \"kubernetes.io/projected/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kube-api-access-m4gjr\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.086604 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.086577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.110129 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.110050 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"] Apr 21 15:51:51.114150 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:51:51.114125 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48562357_724a_429c_b5df_83c134ccdde5.slice/crio-9f056a79751fe0324f2ecc2aebdea67ad1d2e8fc623c865f06cebc624dceef1f WatchSource:0}: Error finding container 9f056a79751fe0324f2ecc2aebdea67ad1d2e8fc623c865f06cebc624dceef1f: Status 404 returned error can't find the container with id 9f056a79751fe0324f2ecc2aebdea67ad1d2e8fc623c865f06cebc624dceef1f Apr 21 15:51:51.191767 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.191744 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:51:51.335542 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.335509 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"] Apr 21 15:51:51.338901 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:51:51.338862 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4221cc6_8994_49d8_b3d6_7cae9fd91f54.slice/crio-6c8789026ef8d2608c45075b92f41716751baea67cc5f50e2d535e87a6e8eca4 WatchSource:0}: Error finding container 6c8789026ef8d2608c45075b92f41716751baea67cc5f50e2d535e87a6e8eca4: Status 404 returned error can't find the container with id 6c8789026ef8d2608c45075b92f41716751baea67cc5f50e2d535e87a6e8eca4 Apr 21 15:51:51.731132 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.731039 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" event={"ID":"48562357-724a-429c-b5df-83c134ccdde5","Type":"ContainerStarted","Data":"94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9"} Apr 21 15:51:51.731132 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.731090 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" event={"ID":"48562357-724a-429c-b5df-83c134ccdde5","Type":"ContainerStarted","Data":"9f056a79751fe0324f2ecc2aebdea67ad1d2e8fc623c865f06cebc624dceef1f"}
Apr 21 15:51:51.732667 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.732635 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" event={"ID":"e4221cc6-8994-49d8-b3d6-7cae9fd91f54","Type":"ContainerStarted","Data":"03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c"}
Apr 21 15:51:51.732778 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:51.732674 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" event={"ID":"e4221cc6-8994-49d8-b3d6-7cae9fd91f54","Type":"ContainerStarted","Data":"6c8789026ef8d2608c45075b92f41716751baea67cc5f50e2d535e87a6e8eca4"}
Apr 21 15:51:52.738253 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:52.738166 2573 generic.go:358] "Generic (PLEG): container finished" podID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerID="03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c" exitCode=0
Apr 21 15:51:52.738679 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:52.738301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" event={"ID":"e4221cc6-8994-49d8-b3d6-7cae9fd91f54","Type":"ContainerDied","Data":"03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c"}
Apr 21 15:51:53.744187 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:53.744128 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"
event={"ID":"e4221cc6-8994-49d8-b3d6-7cae9fd91f54","Type":"ContainerStarted","Data":"180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe"}
Apr 21 15:51:53.744187 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:53.744189 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" event={"ID":"e4221cc6-8994-49d8-b3d6-7cae9fd91f54","Type":"ContainerStarted","Data":"62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a"}
Apr 21 15:51:53.744703 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:53.744255 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"
Apr 21 15:51:53.773861 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:53.773782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" podStartSLOduration=3.773762958 podStartE2EDuration="3.773762958s" podCreationTimestamp="2026-04-21 15:51:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:51:53.768178317 +0000 UTC m=+1008.898231976" watchObservedRunningTime="2026-04-21 15:51:53.773762958 +0000 UTC m=+1008.903816593"
Apr 21 15:51:56.756437 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:56.756401 2573 generic.go:358] "Generic (PLEG): container finished" podID="48562357-724a-429c-b5df-83c134ccdde5" containerID="94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9" exitCode=0
Apr 21 15:51:56.756900 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:51:56.756452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"
event={"ID":"48562357-724a-429c-b5df-83c134ccdde5","Type":"ContainerDied","Data":"94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9"}
Apr 21 15:52:01.192541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:01.192161 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"
Apr 21 15:52:01.192541 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:01.192213 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"
Apr 21 15:52:01.193700 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:01.193614 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.37:8082/healthz\": dial tcp 10.132.0.37:8082: connect: connection refused"
Apr 21 15:52:11.194417 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:11.194322 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"
Apr 21 15:52:11.195892 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:11.195860 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"
Apr 21 15:52:13.832442 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:13.832405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" event={"ID":"5ab1c193-73c8-471b-96d9-c21e1942317e","Type":"ContainerStarted","Data":"e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9"}
Apr 21 15:52:13.834174 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:13.834140
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" event={"ID":"48562357-724a-429c-b5df-83c134ccdde5","Type":"ContainerStarted","Data":"b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834"}
Apr 21 15:52:13.866915 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:13.866855 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podStartSLOduration=7.878643562 podStartE2EDuration="48.866835471s" podCreationTimestamp="2026-04-21 15:51:25 +0000 UTC" firstStartedPulling="2026-04-21 15:51:31.641662185 +0000 UTC m=+986.771715797" lastFinishedPulling="2026-04-21 15:52:12.629854078 +0000 UTC m=+1027.759907706" observedRunningTime="2026-04-21 15:52:13.864143263 +0000 UTC m=+1028.994196897" watchObservedRunningTime="2026-04-21 15:52:13.866835471 +0000 UTC m=+1028.996889107"
Apr 21 15:52:13.891862 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:13.891788 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podStartSLOduration=7.724866592 podStartE2EDuration="23.891771888s" podCreationTimestamp="2026-04-21 15:51:50 +0000 UTC" firstStartedPulling="2026-04-21 15:51:56.757569385 +0000 UTC m=+1011.887623000" lastFinishedPulling="2026-04-21 15:52:12.92447467 +0000 UTC m=+1028.054528296" observedRunningTime="2026-04-21 15:52:13.889496961 +0000 UTC m=+1029.019550596" watchObservedRunningTime="2026-04-21 15:52:13.891771888 +0000 UTC m=+1029.021825526"
Apr 21 15:52:16.289136 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:16.289090 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:52:16.289136 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:16.289142 2573 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:52:16.290687 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:16.290655 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:52:20.954836 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:20.954787 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"
Apr 21 15:52:20.954836 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:20.954846 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"
Apr 21 15:52:20.956632 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:20.956587 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:52:26.290247 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:26.290198 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:52:30.954921 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:30.954870 2573 prober.go:120] "Probe failed" probeType="Startup"
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:52:31.826968 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:31.826938 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"
Apr 21 15:52:36.289555 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:36.289514 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:52:40.955432 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:40.955373 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:52:46.289380 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:46.289325 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:52:50.955627 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:50.955580 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"
podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:52:56.289998 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:52:56.289903 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:53:00.954784 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:00.954729 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:53:06.289765 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:06.289715 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:53:10.954634 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:10.954593 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:53:16.289697 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:16.289654 2573 prober.go:120] "Probe failed" probeType="Startup"
pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:53:20.955083 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:20.955037 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:53:26.289928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:26.289870 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:53:30.955664 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:30.955617 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:53:36.290010 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:36.289965 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:53:40.955597 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:40.955555 2573 prober.go:120]
"Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:53:46.289508 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:46.289452 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:53:50.954880 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:50.954830 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:53:56.289859 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:53:56.289770 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:54:00.955168 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:00.955114 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:54:06.289728 ip-10-0-136-123
kubenswrapper[2573]: I0421 15:54:06.289679 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 21 15:54:10.954783 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:10.954722 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:54:16.304143 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:16.304105 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:54:16.312000 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:16.311974 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:54:17.574517 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:17.574482 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"]
Apr 21 15:54:18.294020 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:18.293982 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" containerID="cri-o://e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9" gracePeriod=30
Apr 21 15:54:20.955276 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:20.955232 2573 prober.go:120] "Probe failed" probeType="Startup"
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 21 15:54:30.965234 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:30.965150 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"
Apr 21 15:54:30.973014 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:30.972983 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"
Apr 21 15:54:48.565400 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.565376 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7b655f99d9-4s9g6_5ab1c193-73c8-471b-96d9-c21e1942317e/main/0.log"
Apr 21 15:54:48.565743 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.565730 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:54:48.651649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.651544 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-model-cache\") pod \"5ab1c193-73c8-471b-96d9-c21e1942317e\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") "
Apr 21 15:54:48.651857 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.651707 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-model-cache" (OuterVolumeSpecName: "model-cache") pod "5ab1c193-73c8-471b-96d9-c21e1942317e" (UID: "5ab1c193-73c8-471b-96d9-c21e1942317e").
InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:48.651857 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.651739 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmcrg\" (UniqueName: \"kubernetes.io/projected/5ab1c193-73c8-471b-96d9-c21e1942317e-kube-api-access-fmcrg\") pod \"5ab1c193-73c8-471b-96d9-c21e1942317e\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") "
Apr 21 15:54:48.651857 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.651781 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-dshm\") pod \"5ab1c193-73c8-471b-96d9-c21e1942317e\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") "
Apr 21 15:54:48.652038 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.651864 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-home\") pod \"5ab1c193-73c8-471b-96d9-c21e1942317e\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") "
Apr 21 15:54:48.652038 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.651921 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab1c193-73c8-471b-96d9-c21e1942317e-tls-certs\") pod \"5ab1c193-73c8-471b-96d9-c21e1942317e\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") "
Apr 21 15:54:48.652038 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.651951 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-kserve-provision-location\") pod \"5ab1c193-73c8-471b-96d9-c21e1942317e\" (UID: \"5ab1c193-73c8-471b-96d9-c21e1942317e\") "
Apr 21 15:54:48.652274 ip-10-0-136-123 kubenswrapper[2573]:
I0421 15:54:48.652239 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:54:48.652337 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.652264 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-home" (OuterVolumeSpecName: "home") pod "5ab1c193-73c8-471b-96d9-c21e1942317e" (UID: "5ab1c193-73c8-471b-96d9-c21e1942317e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:48.654168 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.654138 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab1c193-73c8-471b-96d9-c21e1942317e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5ab1c193-73c8-471b-96d9-c21e1942317e" (UID: "5ab1c193-73c8-471b-96d9-c21e1942317e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:54:48.654375 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.654355 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-dshm" (OuterVolumeSpecName: "dshm") pod "5ab1c193-73c8-471b-96d9-c21e1942317e" (UID: "5ab1c193-73c8-471b-96d9-c21e1942317e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:48.654441 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.654376 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab1c193-73c8-471b-96d9-c21e1942317e-kube-api-access-fmcrg" (OuterVolumeSpecName: "kube-api-access-fmcrg") pod "5ab1c193-73c8-471b-96d9-c21e1942317e" (UID: "5ab1c193-73c8-471b-96d9-c21e1942317e"). InnerVolumeSpecName "kube-api-access-fmcrg".
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:54:48.722943 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.722897 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5ab1c193-73c8-471b-96d9-c21e1942317e" (UID: "5ab1c193-73c8-471b-96d9-c21e1942317e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:54:48.753649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.753608 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab1c193-73c8-471b-96d9-c21e1942317e-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:54:48.753649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.753641 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:54:48.753649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.753653 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmcrg\" (UniqueName: \"kubernetes.io/projected/5ab1c193-73c8-471b-96d9-c21e1942317e-kube-api-access-fmcrg\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:54:48.753908 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.753662 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:54:48.753908 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:48.753672 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName:
\"kubernetes.io/empty-dir/5ab1c193-73c8-471b-96d9-c21e1942317e-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:54:49.404880 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.404844 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-7b655f99d9-4s9g6_5ab1c193-73c8-471b-96d9-c21e1942317e/main/0.log"
Apr 21 15:54:49.405232 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.405208 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerID="e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9" exitCode=137
Apr 21 15:54:49.405339 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.405256 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" event={"ID":"5ab1c193-73c8-471b-96d9-c21e1942317e","Type":"ContainerDied","Data":"e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9"}
Apr 21 15:54:49.405339 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.405279 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6" event={"ID":"5ab1c193-73c8-471b-96d9-c21e1942317e","Type":"ContainerDied","Data":"44998776a7bef74bccec014766c6e027b1cb3a0a1a7adaf800d3ffbe8a19233b"}
Apr 21 15:54:49.405339 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.405294 2573 scope.go:117] "RemoveContainer" containerID="e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9"
Apr 21 15:54:49.405339 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.405299 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"
Apr 21 15:54:49.427031 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.427008 2573 scope.go:117] "RemoveContainer" containerID="df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e"
Apr 21 15:54:49.433397 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.433371 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"]
Apr 21 15:54:49.438912 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.438881 2573 scope.go:117] "RemoveContainer" containerID="e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9"
Apr 21 15:54:49.439348 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:54:49.439316 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9\": container with ID starting with e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9 not found: ID does not exist" containerID="e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9"
Apr 21 15:54:49.439431 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.439359 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9"} err="failed to get container status \"e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9\": rpc error: code = NotFound desc = could not find container \"e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9\": container with ID starting with e430244df67b783432e1ce36bbc4c481e2a5137bd9a9f9304ea94f83af6ef0f9 not found: ID does not exist"
Apr 21 15:54:49.439431 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.439384 2573 scope.go:117] "RemoveContainer" containerID="df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e"
Apr 21
15:54:49.439512 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.439482 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-7b655f99d9-4s9g6"]
Apr 21 15:54:49.439740 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:54:49.439712 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e\": container with ID starting with df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e not found: ID does not exist" containerID="df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e"
Apr 21 15:54:49.439818 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.439745 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e"} err="failed to get container status \"df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e\": rpc error: code = NotFound desc = could not find container \"df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e\": container with ID starting with df26f1a038f877da01b8e807f8f812e62f8fc6d607505c4597a27c4af743eb9e not found: ID does not exist"
Apr 21 15:54:49.482002 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:49.481969 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" path="/var/lib/kubelet/pods/5ab1c193-73c8-471b-96d9-c21e1942317e/volumes"
Apr 21 15:54:53.515846 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:53.515806 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"]
Apr 21 15:54:53.516273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:53.516184 2573 kuberuntime_container.go:864] "Killing container with a grace period"
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main" containerID="cri-o://b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834" gracePeriod=30 Apr 21 15:54:53.525294 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:53.525262 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"] Apr 21 15:54:53.525556 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:53.525526 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="main" containerID="cri-o://62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a" gracePeriod=30 Apr 21 15:54:53.525699 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:53.525602 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="tokenizer" containerID="cri-o://180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe" gracePeriod=30 Apr 21 15:54:54.436488 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:54.436444 2573 generic.go:358] "Generic (PLEG): container finished" podID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerID="62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a" exitCode=0 Apr 21 15:54:54.436672 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:54.436507 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" event={"ID":"e4221cc6-8994-49d8-b3d6-7cae9fd91f54","Type":"ContainerDied","Data":"62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a"} Apr 21 15:54:54.881476 
ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:54.881447 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:54:55.010041 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.009946 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kserve-provision-location\") pod \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " Apr 21 15:54:55.010041 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.009985 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tls-certs\") pod \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " Apr 21 15:54:55.010273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010043 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-cache\") pod \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " Apr 21 15:54:55.010273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010080 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-tmp\") pod \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " Apr 21 15:54:55.010385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010290 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4gjr\" (UniqueName: 
\"kubernetes.io/projected/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kube-api-access-m4gjr\") pod \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " Apr 21 15:54:55.010385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010322 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e4221cc6-8994-49d8-b3d6-7cae9fd91f54" (UID: "e4221cc6-8994-49d8-b3d6-7cae9fd91f54"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:54:55.010385 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010336 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-uds\") pod \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\" (UID: \"e4221cc6-8994-49d8-b3d6-7cae9fd91f54\") " Apr 21 15:54:55.010520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010410 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e4221cc6-8994-49d8-b3d6-7cae9fd91f54" (UID: "e4221cc6-8994-49d8-b3d6-7cae9fd91f54"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:54:55.010607 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010580 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e4221cc6-8994-49d8-b3d6-7cae9fd91f54" (UID: "e4221cc6-8994-49d8-b3d6-7cae9fd91f54"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:54:55.010659 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010632 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:54:55.010659 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010651 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-tmp\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:54:55.010888 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.010868 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e4221cc6-8994-49d8-b3d6-7cae9fd91f54" (UID: "e4221cc6-8994-49d8-b3d6-7cae9fd91f54"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:54:55.012210 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.012188 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e4221cc6-8994-49d8-b3d6-7cae9fd91f54" (UID: "e4221cc6-8994-49d8-b3d6-7cae9fd91f54"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:54:55.012329 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.012305 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kube-api-access-m4gjr" (OuterVolumeSpecName: "kube-api-access-m4gjr") pod "e4221cc6-8994-49d8-b3d6-7cae9fd91f54" (UID: "e4221cc6-8994-49d8-b3d6-7cae9fd91f54"). 
InnerVolumeSpecName "kube-api-access-m4gjr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:54:55.111709 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.111672 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m4gjr\" (UniqueName: \"kubernetes.io/projected/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kube-api-access-m4gjr\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:54:55.111709 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.111704 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tokenizer-uds\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:54:55.111709 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.111715 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:54:55.111976 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.111726 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4221cc6-8994-49d8-b3d6-7cae9fd91f54-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:54:55.441923 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.441887 2573 generic.go:358] "Generic (PLEG): container finished" podID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerID="180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe" exitCode=0 Apr 21 15:54:55.442104 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.441960 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" Apr 21 15:54:55.442104 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.441980 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" event={"ID":"e4221cc6-8994-49d8-b3d6-7cae9fd91f54","Type":"ContainerDied","Data":"180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe"} Apr 21 15:54:55.442104 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.442022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d" event={"ID":"e4221cc6-8994-49d8-b3d6-7cae9fd91f54","Type":"ContainerDied","Data":"6c8789026ef8d2608c45075b92f41716751baea67cc5f50e2d535e87a6e8eca4"} Apr 21 15:54:55.442104 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.442042 2573 scope.go:117] "RemoveContainer" containerID="180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe" Apr 21 15:54:55.451144 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.451113 2573 scope.go:117] "RemoveContainer" containerID="62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a" Apr 21 15:54:55.458773 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.458754 2573 scope.go:117] "RemoveContainer" containerID="03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c" Apr 21 15:54:55.466519 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.466490 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"] Apr 21 15:54:55.468757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.468737 2573 scope.go:117] "RemoveContainer" containerID="180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe" Apr 21 15:54:55.468881 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.468858 2573 kubelet.go:2547] "SyncLoop 
REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schetgt8d"] Apr 21 15:54:55.469080 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:54:55.469047 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe\": container with ID starting with 180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe not found: ID does not exist" containerID="180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe" Apr 21 15:54:55.469148 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.469094 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe"} err="failed to get container status \"180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe\": rpc error: code = NotFound desc = could not find container \"180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe\": container with ID starting with 180527e394b6c3e51b6cce37904a3942a59ef53cad5d59071cb81294f75b10fe not found: ID does not exist" Apr 21 15:54:55.469148 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.469121 2573 scope.go:117] "RemoveContainer" containerID="62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a" Apr 21 15:54:55.469391 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:54:55.469374 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a\": container with ID starting with 62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a not found: ID does not exist" containerID="62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a" Apr 21 15:54:55.469430 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.469397 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a"} err="failed to get container status \"62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a\": rpc error: code = NotFound desc = could not find container \"62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a\": container with ID starting with 62d7602bf1fa3255adcc6e68e747159424cefe82279e60cdff1568919c12db0a not found: ID does not exist" Apr 21 15:54:55.469430 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.469414 2573 scope.go:117] "RemoveContainer" containerID="03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c" Apr 21 15:54:55.469641 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:54:55.469622 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c\": container with ID starting with 03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c not found: ID does not exist" containerID="03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c" Apr 21 15:54:55.469687 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.469650 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c"} err="failed to get container status \"03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c\": rpc error: code = NotFound desc = could not find container \"03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c\": container with ID starting with 03e776825761d06e8ce34c14f523f3c1dc174810d8cda8e35423e3d9c483c57c not found: ID does not exist" Apr 21 15:54:55.482475 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:54:55.482448 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" path="/var/lib/kubelet/pods/e4221cc6-8994-49d8-b3d6-7cae9fd91f54/volumes" Apr 21 15:55:05.458627 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.458592 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 15:55:05.459503 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.459473 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 15:55:05.890517 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890485 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8"] Apr 21 15:55:05.890890 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890866 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" Apr 21 15:55:05.890890 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890889 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890914 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="tokenizer" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890922 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="tokenizer" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890933 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="storage-initializer" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: 
I0421 15:55:05.890941 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="storage-initializer" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890961 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="storage-initializer" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890968 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="storage-initializer" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890979 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="main" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.890986 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="main" Apr 21 15:55:05.891077 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.891070 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="main" Apr 21 15:55:05.891531 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.891083 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4221cc6-8994-49d8-b3d6-7cae9fd91f54" containerName="tokenizer" Apr 21 15:55:05.891531 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.891094 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ab1c193-73c8-471b-96d9-c21e1942317e" containerName="main" Apr 21 15:55:05.894417 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.894391 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:05.897743 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.897697 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 21 15:55:05.908524 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:05.908490 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8"] Apr 21 15:55:06.015919 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.015880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdrm\" (UniqueName: \"kubernetes.io/projected/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kube-api-access-tmdrm\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.015919 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.015923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/703b5a30-0d32-4af1-a6ae-7c5466d465ad-tls-certs\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.016199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.015958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-home\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.016199 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:55:06.016036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-dshm\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.016199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.016081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-model-cache\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.016199 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.016160 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117040 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.116994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-home\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117212 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.117055 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-dshm\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117212 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.117083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-model-cache\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117212 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.117137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117212 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.117193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdrm\" (UniqueName: \"kubernetes.io/projected/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kube-api-access-tmdrm\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117441 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.117223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/703b5a30-0d32-4af1-a6ae-7c5466d465ad-tls-certs\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: 
\"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117499 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.117459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-home\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117499 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.117490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-model-cache\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.117615 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.117592 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.119437 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.119406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-dshm\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.119726 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.119708 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/703b5a30-0d32-4af1-a6ae-7c5466d465ad-tls-certs\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.128568 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.128530 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdrm\" (UniqueName: \"kubernetes.io/projected/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kube-api-access-tmdrm\") pod \"custom-route-timeout-test-kserve-985dcd4b5-xldg8\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.144387 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.144324 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng"] Apr 21 15:55:06.148038 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.148017 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.152971 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.152945 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-r87lc\"" Apr 21 15:55:06.162626 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.162601 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng"] Apr 21 15:55:06.208986 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.208941 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:06.218028 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.217994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.218164 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.218032 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.218164 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.218097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.218272 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.218166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrnlc\" (UniqueName: \"kubernetes.io/projected/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kube-api-access-mrnlc\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.218326 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.218272 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.218326 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.218307 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.319542 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.319508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.319697 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.319554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-tmp\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.319697 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.319639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.319697 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.319666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.319889 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.319707 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.319889 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.319733 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrnlc\" (UniqueName: \"kubernetes.io/projected/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kube-api-access-mrnlc\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.320058 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.320032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.320131 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.320095 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.320182 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.320141 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.320246 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.320227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-cache\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.322258 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.322235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.330190 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.330168 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrnlc\" (UniqueName: \"kubernetes.io/projected/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kube-api-access-mrnlc\") pod \"custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.343116 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.343080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8"] Apr 21 15:55:06.346315 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:55:06.346284 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703b5a30_0d32_4af1_a6ae_7c5466d465ad.slice/crio-c446ea6a25e81e85fa0c7c4b8814d1946f1bc066bafce3efe67ba12ca6eea8e6 WatchSource:0}: Error finding container c446ea6a25e81e85fa0c7c4b8814d1946f1bc066bafce3efe67ba12ca6eea8e6: Status 404 returned error can't find the container with id c446ea6a25e81e85fa0c7c4b8814d1946f1bc066bafce3efe67ba12ca6eea8e6 Apr 21 15:55:06.348212 ip-10-0-136-123 kubenswrapper[2573]: 
I0421 15:55:06.348193 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:55:06.458917 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.458877 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:06.492104 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.492047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" event={"ID":"703b5a30-0d32-4af1-a6ae-7c5466d465ad","Type":"ContainerStarted","Data":"0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253"} Apr 21 15:55:06.492104 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.492085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" event={"ID":"703b5a30-0d32-4af1-a6ae-7c5466d465ad","Type":"ContainerStarted","Data":"c446ea6a25e81e85fa0c7c4b8814d1946f1bc066bafce3efe67ba12ca6eea8e6"} Apr 21 15:55:06.602470 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:06.602442 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng"] Apr 21 15:55:06.604487 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:55:06.604440 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04a9d119_e0f0_49b7_83b1_4b09dc5d6cd7.slice/crio-88961c2f4993d380248bc481ea768da9a6b15d8c1f3eba98dcc24131e8cc9eb8 WatchSource:0}: Error finding container 88961c2f4993d380248bc481ea768da9a6b15d8c1f3eba98dcc24131e8cc9eb8: Status 404 returned error can't find the container with id 88961c2f4993d380248bc481ea768da9a6b15d8c1f3eba98dcc24131e8cc9eb8 Apr 21 15:55:07.496429 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:07.496387 2573 generic.go:358] "Generic (PLEG): 
container finished" podID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerID="9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634" exitCode=0 Apr 21 15:55:07.496951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:07.496436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" event={"ID":"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7","Type":"ContainerDied","Data":"9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634"} Apr 21 15:55:07.496951 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:07.496468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" event={"ID":"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7","Type":"ContainerStarted","Data":"88961c2f4993d380248bc481ea768da9a6b15d8c1f3eba98dcc24131e8cc9eb8"} Apr 21 15:55:08.502572 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:08.502535 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" event={"ID":"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7","Type":"ContainerStarted","Data":"794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c"} Apr 21 15:55:08.502572 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:08.502577 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" event={"ID":"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7","Type":"ContainerStarted","Data":"3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663"} Apr 21 15:55:08.503083 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:08.502723 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:08.537044 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:08.536994 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" podStartSLOduration=2.536972891 podStartE2EDuration="2.536972891s" podCreationTimestamp="2026-04-21 15:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:55:08.534492554 +0000 UTC m=+1203.664546191" watchObservedRunningTime="2026-04-21 15:55:08.536972891 +0000 UTC m=+1203.667026526" Apr 21 15:55:11.520630 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:11.520590 2573 generic.go:358] "Generic (PLEG): container finished" podID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerID="0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253" exitCode=0 Apr 21 15:55:11.521153 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:11.520666 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" event={"ID":"703b5a30-0d32-4af1-a6ae-7c5466d465ad","Type":"ContainerDied","Data":"0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253"} Apr 21 15:55:12.526748 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:12.526705 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" event={"ID":"703b5a30-0d32-4af1-a6ae-7c5466d465ad","Type":"ContainerStarted","Data":"54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be"} Apr 21 15:55:12.551461 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:12.551406 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podStartSLOduration=7.551389607 podStartE2EDuration="7.551389607s" podCreationTimestamp="2026-04-21 15:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-21 15:55:12.548535913 +0000 UTC m=+1207.678589558" watchObservedRunningTime="2026-04-21 15:55:12.551389607 +0000 UTC m=+1207.681443244" Apr 21 15:55:16.209969 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:16.209925 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:16.210451 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:16.209979 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:55:16.211553 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:16.211523 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 21 15:55:16.459024 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:16.458979 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:16.459232 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:16.459025 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:16.461900 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:16.461821 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:16.543822 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:16.543774 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:23.825097 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.825043 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf_48562357-724a-429c-b5df-83c134ccdde5/main/0.log" Apr 21 15:55:23.825508 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.825491 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:55:23.984902 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.984865 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-home\") pod \"48562357-724a-429c-b5df-83c134ccdde5\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " Apr 21 15:55:23.985109 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.984934 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-dshm\") pod \"48562357-724a-429c-b5df-83c134ccdde5\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " Apr 21 15:55:23.985109 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.984978 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-kserve-provision-location\") pod \"48562357-724a-429c-b5df-83c134ccdde5\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " Apr 21 15:55:23.985109 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.985014 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-model-cache\") pod 
\"48562357-724a-429c-b5df-83c134ccdde5\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " Apr 21 15:55:23.985109 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.985039 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48562357-724a-429c-b5df-83c134ccdde5-tls-certs\") pod \"48562357-724a-429c-b5df-83c134ccdde5\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " Apr 21 15:55:23.985109 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.985078 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b87q\" (UniqueName: \"kubernetes.io/projected/48562357-724a-429c-b5df-83c134ccdde5-kube-api-access-7b87q\") pod \"48562357-724a-429c-b5df-83c134ccdde5\" (UID: \"48562357-724a-429c-b5df-83c134ccdde5\") " Apr 21 15:55:23.985362 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.985296 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-model-cache" (OuterVolumeSpecName: "model-cache") pod "48562357-724a-429c-b5df-83c134ccdde5" (UID: "48562357-724a-429c-b5df-83c134ccdde5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:55:23.985362 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.985334 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-home" (OuterVolumeSpecName: "home") pod "48562357-724a-429c-b5df-83c134ccdde5" (UID: "48562357-724a-429c-b5df-83c134ccdde5"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:55:23.985484 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.985465 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:55:23.985546 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.985488 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:55:23.987116 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.987090 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-dshm" (OuterVolumeSpecName: "dshm") pod "48562357-724a-429c-b5df-83c134ccdde5" (UID: "48562357-724a-429c-b5df-83c134ccdde5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:55:23.987350 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.987331 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48562357-724a-429c-b5df-83c134ccdde5-kube-api-access-7b87q" (OuterVolumeSpecName: "kube-api-access-7b87q") pod "48562357-724a-429c-b5df-83c134ccdde5" (UID: "48562357-724a-429c-b5df-83c134ccdde5"). InnerVolumeSpecName "kube-api-access-7b87q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:55:23.987601 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:23.987566 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48562357-724a-429c-b5df-83c134ccdde5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "48562357-724a-429c-b5df-83c134ccdde5" (UID: "48562357-724a-429c-b5df-83c134ccdde5"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:55:24.059542 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.059472 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "48562357-724a-429c-b5df-83c134ccdde5" (UID: "48562357-724a-429c-b5df-83c134ccdde5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:55:24.086681 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.086633 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:55:24.086835 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.086688 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/48562357-724a-429c-b5df-83c134ccdde5-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:55:24.086835 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.086706 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/48562357-724a-429c-b5df-83c134ccdde5-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:55:24.086835 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.086724 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7b87q\" (UniqueName: \"kubernetes.io/projected/48562357-724a-429c-b5df-83c134ccdde5-kube-api-access-7b87q\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:55:24.573129 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.573095 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf_48562357-724a-429c-b5df-83c134ccdde5/main/0.log" Apr 21 15:55:24.573481 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.573456 2573 generic.go:358] "Generic (PLEG): container finished" podID="48562357-724a-429c-b5df-83c134ccdde5" containerID="b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834" exitCode=137 Apr 21 15:55:24.573536 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.573525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" event={"ID":"48562357-724a-429c-b5df-83c134ccdde5","Type":"ContainerDied","Data":"b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834"} Apr 21 15:55:24.573594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.573552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" event={"ID":"48562357-724a-429c-b5df-83c134ccdde5","Type":"ContainerDied","Data":"9f056a79751fe0324f2ecc2aebdea67ad1d2e8fc623c865f06cebc624dceef1f"} Apr 21 15:55:24.573594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.573568 2573 scope.go:117] "RemoveContainer" containerID="b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834" Apr 21 15:55:24.573678 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.573592 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf" Apr 21 15:55:24.595430 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.595402 2573 scope.go:117] "RemoveContainer" containerID="94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9" Apr 21 15:55:24.602166 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.602140 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"] Apr 21 15:55:24.608229 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.608192 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74d49769d7rh9zf"] Apr 21 15:55:24.613048 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.613016 2573 scope.go:117] "RemoveContainer" containerID="b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834" Apr 21 15:55:24.613556 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:55:24.613527 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834\": container with ID starting with b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834 not found: ID does not exist" containerID="b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834" Apr 21 15:55:24.613621 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.613571 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834"} err="failed to get container status \"b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834\": rpc error: code = NotFound desc = could not find container \"b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834\": container with ID starting with 
b5edf2968504e946118e067c24f0513df4da558de42fe795454b2d72c92b2834 not found: ID does not exist" Apr 21 15:55:24.613621 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.613614 2573 scope.go:117] "RemoveContainer" containerID="94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9" Apr 21 15:55:24.613988 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:55:24.613944 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9\": container with ID starting with 94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9 not found: ID does not exist" containerID="94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9" Apr 21 15:55:24.614057 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:24.614000 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9"} err="failed to get container status \"94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9\": rpc error: code = NotFound desc = could not find container \"94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9\": container with ID starting with 94a74bc2439309ee40cf217645f5f34d7ba4ab5639df16c139824562265160b9 not found: ID does not exist" Apr 21 15:55:25.481839 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:25.481787 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48562357-724a-429c-b5df-83c134ccdde5" path="/var/lib/kubelet/pods/48562357-724a-429c-b5df-83c134ccdde5/volumes" Apr 21 15:55:26.210332 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:26.210282 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 21 15:55:36.210032 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:36.209988 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 21 15:55:38.553246 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:38.553215 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" Apr 21 15:55:46.209261 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:46.209215 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 21 15:55:56.210264 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:55:56.210170 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 21 15:56:06.210046 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:06.209997 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 21 15:56:16.210105 ip-10-0-136-123 
kubenswrapper[2573]: I0421 15:56:16.210057 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 21 15:56:26.210323 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:26.210278 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 21 15:56:36.209421 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:36.209377 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 21 15:56:46.219290 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:46.219258 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8"
Apr 21 15:56:46.226945 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:46.226917 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8"
Apr 21 15:56:52.504101 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:52.504049 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8"]
Apr 21 15:56:52.505006 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:52.504951 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" containerID="cri-o://54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be" gracePeriod=30
Apr 21 15:56:52.506229 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:52.506207 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng"]
Apr 21 15:56:52.506566 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:52.506542 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="main" containerID="cri-o://3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663" gracePeriod=30
Apr 21 15:56:52.506685 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:52.506632 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="tokenizer" containerID="cri-o://794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c" gracePeriod=30
Apr 21 15:56:52.887278 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:52.887240 2573 generic.go:358] "Generic (PLEG): container finished" podID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerID="3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663" exitCode=0
Apr 21 15:56:52.887448 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:52.887284 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" event={"ID":"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7","Type":"ContainerDied","Data":"3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663"}
Apr 21 15:56:53.697072 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.697050 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng"
Apr 21 15:56:53.829444 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.829395 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-cache\") pod \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") "
Apr 21 15:56:53.829627 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.829482 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-uds\") pod \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") "
Apr 21 15:56:53.829627 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.829549 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kserve-provision-location\") pod \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") "
Apr 21 15:56:53.829627 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.829587 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-tmp\") pod \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") "
Apr 21 15:56:53.829627 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.829625 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tls-certs\") pod \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") "
Apr 21 15:56:53.829924 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.829649 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrnlc\" (UniqueName: \"kubernetes.io/projected/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kube-api-access-mrnlc\") pod \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\" (UID: \"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7\") "
Apr 21 15:56:53.830124 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.830095 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" (UID: "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:56:53.830208 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.830180 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" (UID: "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:56:53.830272 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.830250 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" (UID: "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:56:53.830928 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.830897 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" (UID: "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:56:53.832322 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.832294 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kube-api-access-mrnlc" (OuterVolumeSpecName: "kube-api-access-mrnlc") pod "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" (UID: "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7"). InnerVolumeSpecName "kube-api-access-mrnlc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:56:53.832445 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.832386 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" (UID: "04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:56:53.893062 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.893028 2573 generic.go:358] "Generic (PLEG): container finished" podID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerID="794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c" exitCode=0
Apr 21 15:56:53.893173 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.893101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" event={"ID":"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7","Type":"ContainerDied","Data":"794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c"}
Apr 21 15:56:53.893173 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.893107 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng"
Apr 21 15:56:53.893173 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.893135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng" event={"ID":"04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7","Type":"ContainerDied","Data":"88961c2f4993d380248bc481ea768da9a6b15d8c1f3eba98dcc24131e8cc9eb8"}
Apr 21 15:56:53.893173 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.893153 2573 scope.go:117] "RemoveContainer" containerID="794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c"
Apr 21 15:56:53.901952 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.901935 2573 scope.go:117] "RemoveContainer" containerID="3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663"
Apr 21 15:56:53.921820 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.918043 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng"]
Apr 21 15:56:53.922674 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.922601 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-5fc4dcb59gkng"]
Apr 21 15:56:53.931347 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.931229 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:56:53.931347 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.931263 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrnlc\" (UniqueName: \"kubernetes.io/projected/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kube-api-access-mrnlc\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:56:53.931347 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.931281 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:56:53.931347 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.931297 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-uds\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:56:53.931347 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.931312 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:56:53.931347 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.931325 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7-tokenizer-tmp\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:56:53.933518 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.933498 2573 scope.go:117] "RemoveContainer" containerID="9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634"
Apr 21 15:56:53.941223 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.941194 2573 scope.go:117] "RemoveContainer" containerID="794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c"
Apr 21 15:56:53.941491 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:56:53.941471 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c\": container with ID starting with 794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c not found: ID does not exist" containerID="794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c"
Apr 21 15:56:53.941547 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.941500 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c"} err="failed to get container status \"794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c\": rpc error: code = NotFound desc = could not find container \"794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c\": container with ID starting with 794f2a3a95c7d4e18a89f00a8a0ad56d5c7d9f979654313c05d6e8ce30a0dd5c not found: ID does not exist"
Apr 21 15:56:53.941547 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.941522 2573 scope.go:117] "RemoveContainer" containerID="3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663"
Apr 21 15:56:53.941762 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:56:53.941743 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663\": container with ID starting with 3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663 not found: ID does not exist" containerID="3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663"
Apr 21 15:56:53.941840 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.941772 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663"} err="failed to get container status \"3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663\": rpc error: code = NotFound desc = could not find container \"3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663\": container with ID starting with 3fea8f050e4b4a5180cda0da35acd51603d345da0fd319c1e4266d968a5d5663 not found: ID does not exist"
Apr 21 15:56:53.941840 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.941812 2573 scope.go:117] "RemoveContainer" containerID="9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634"
Apr 21 15:56:53.942084 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:56:53.942062 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634\": container with ID starting with 9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634 not found: ID does not exist" containerID="9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634"
Apr 21 15:56:53.942119 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:53.942091 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634"} err="failed to get container status \"9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634\": rpc error: code = NotFound desc = could not find container \"9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634\": container with ID starting with 9e9e842a28c5feb04091bea25aefa98dae37f4c0ca63d9bbc9ea82c402e9d634 not found: ID does not exist"
Apr 21 15:56:55.486249 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:55.486213 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" path="/var/lib/kubelet/pods/04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7/volumes"
Apr 21 15:56:56.855241 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855198 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"]
Apr 21 15:56:56.855695 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855659 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="main"
Apr 21 15:56:56.855695 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855673 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="main"
Apr 21 15:56:56.855695 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855688 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="storage-initializer"
Apr 21 15:56:56.855695 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855696 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="storage-initializer"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855708 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855717 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855725 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="storage-initializer"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855733 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="storage-initializer"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855744 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="tokenizer"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855752 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="tokenizer"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855890 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="48562357-724a-429c-b5df-83c134ccdde5" containerName="main"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855908 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="tokenizer"
Apr 21 15:56:56.855940 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.855918 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="04a9d119-e0f0-49b7-83b1-4b09dc5d6cd7" containerName="main"
Apr 21 15:56:56.859460 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.859437 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:56.862407 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.862384 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-ffhwc\""
Apr 21 15:56:56.862520 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.862466 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 21 15:56:56.870458 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.870434 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"]
Apr 21 15:56:56.955697 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.955664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:56.955949 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.955708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:56.955949 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.955725 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv9kl\" (UniqueName: \"kubernetes.io/projected/071ee785-900b-47c6-9e8f-c65e221cc3ad-kube-api-access-fv9kl\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:56.955949 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.955856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:56.955949 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.955892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/071ee785-900b-47c6-9e8f-c65e221cc3ad-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:56.956112 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:56.955949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057046 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057194 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/071ee785-900b-47c6-9e8f-c65e221cc3ad-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057194 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057194 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057335 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057392 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fv9kl\" (UniqueName: \"kubernetes.io/projected/071ee785-900b-47c6-9e8f-c65e221cc3ad-kube-api-access-fv9kl\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057485 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057536 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057536 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057529 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.057757 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.057732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.059741 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.059717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/071ee785-900b-47c6-9e8f-c65e221cc3ad-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.065530 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.065505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv9kl\" (UniqueName: \"kubernetes.io/projected/071ee785-900b-47c6-9e8f-c65e221cc3ad-kube-api-access-fv9kl\") pod \"router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.170371 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.170288 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:57.298970 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.298941 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"]
Apr 21 15:56:57.301456 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:56:57.301427 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod071ee785_900b_47c6_9e8f_c65e221cc3ad.slice/crio-aa93790b8f3d7ccf01fda798578e64475a412b502d6859d053f482d2f122df80 WatchSource:0}: Error finding container aa93790b8f3d7ccf01fda798578e64475a412b502d6859d053f482d2f122df80: Status 404 returned error can't find the container with id aa93790b8f3d7ccf01fda798578e64475a412b502d6859d053f482d2f122df80
Apr 21 15:56:57.911387 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.911353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" event={"ID":"071ee785-900b-47c6-9e8f-c65e221cc3ad","Type":"ContainerStarted","Data":"eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5"}
Apr 21 15:56:57.911387 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:57.911393 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" event={"ID":"071ee785-900b-47c6-9e8f-c65e221cc3ad","Type":"ContainerStarted","Data":"aa93790b8f3d7ccf01fda798578e64475a412b502d6859d053f482d2f122df80"}
Apr 21 15:56:58.915558 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:58.915526 2573 generic.go:358] "Generic (PLEG): container finished" podID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerID="eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5" exitCode=0
Apr 21 15:56:58.915947 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:58.915600 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" event={"ID":"071ee785-900b-47c6-9e8f-c65e221cc3ad","Type":"ContainerDied","Data":"eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5"}
Apr 21 15:56:59.920653 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:59.920614 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" event={"ID":"071ee785-900b-47c6-9e8f-c65e221cc3ad","Type":"ContainerStarted","Data":"891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc"}
Apr 21 15:56:59.920653 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:59.920656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" event={"ID":"071ee785-900b-47c6-9e8f-c65e221cc3ad","Type":"ContainerStarted","Data":"92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc"}
Apr 21 15:56:59.921117 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:59.920687 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:56:59.943246 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:56:59.943186 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" podStartSLOduration=3.94316652 podStartE2EDuration="3.94316652s" podCreationTimestamp="2026-04-21 15:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:56:59.940921567 +0000 UTC m=+1315.070975202" watchObservedRunningTime="2026-04-21 15:56:59.94316652 +0000 UTC m=+1315.073220156"
Apr 21 15:57:03.837104 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:03.837060 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-7454559757-dpm6j"]
Apr 21 15:57:03.837570 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:03.837313 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" podUID="56a319d7-3883-4871-bc37-8a8b0772785c" containerName="manager" containerID="cri-o://1b1808b36d101e7222c748900fe184877fad0ac9376ea9125f64c8fed74a291c" gracePeriod=30
Apr 21 15:57:07.170509 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:07.170474 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:57:07.170968 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:07.170637 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:57:07.173288 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:07.173265 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:57:07.948208 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:07.948174 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"
Apr 21 15:57:08.952337 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:08.952245 2573 generic.go:358] "Generic (PLEG): container finished" podID="56a319d7-3883-4871-bc37-8a8b0772785c" containerID="1b1808b36d101e7222c748900fe184877fad0ac9376ea9125f64c8fed74a291c" exitCode=0
Apr 21 15:57:08.952337 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:08.952318 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" event={"ID":"56a319d7-3883-4871-bc37-8a8b0772785c","Type":"ContainerDied","Data":"1b1808b36d101e7222c748900fe184877fad0ac9376ea9125f64c8fed74a291c"}
Apr 21 15:57:09.276098 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.276076 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j"
Apr 21 15:57:09.361937 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.361905 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d687v\" (UniqueName: \"kubernetes.io/projected/56a319d7-3883-4871-bc37-8a8b0772785c-kube-api-access-d687v\") pod \"56a319d7-3883-4871-bc37-8a8b0772785c\" (UID: \"56a319d7-3883-4871-bc37-8a8b0772785c\") "
Apr 21 15:57:09.361937 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.361945 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56a319d7-3883-4871-bc37-8a8b0772785c-cert\") pod \"56a319d7-3883-4871-bc37-8a8b0772785c\" (UID: \"56a319d7-3883-4871-bc37-8a8b0772785c\") "
Apr 21 15:57:09.364016 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.363980 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a319d7-3883-4871-bc37-8a8b0772785c-kube-api-access-d687v" (OuterVolumeSpecName: "kube-api-access-d687v") pod "56a319d7-3883-4871-bc37-8a8b0772785c" (UID: "56a319d7-3883-4871-bc37-8a8b0772785c"). InnerVolumeSpecName "kube-api-access-d687v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:57:09.364112 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.364050 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a319d7-3883-4871-bc37-8a8b0772785c-cert" (OuterVolumeSpecName: "cert") pod "56a319d7-3883-4871-bc37-8a8b0772785c" (UID: "56a319d7-3883-4871-bc37-8a8b0772785c"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:57:09.463205 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.463155 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d687v\" (UniqueName: \"kubernetes.io/projected/56a319d7-3883-4871-bc37-8a8b0772785c-kube-api-access-d687v\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:57:09.463205 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.463200 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56a319d7-3883-4871-bc37-8a8b0772785c-cert\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 15:57:09.957649 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.957619 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j"
Apr 21 15:57:09.958136 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.957642 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7454559757-dpm6j" event={"ID":"56a319d7-3883-4871-bc37-8a8b0772785c","Type":"ContainerDied","Data":"151332a37a3e95397413b2b0a2e78145273da6e14515cd5744c289656b2a2fd4"}
Apr 21 15:57:09.958136 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.957689 2573 scope.go:117] "RemoveContainer" containerID="1b1808b36d101e7222c748900fe184877fad0ac9376ea9125f64c8fed74a291c"
Apr 21 15:57:09.975822 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.975783 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-7454559757-dpm6j"]
Apr 21 15:57:09.979200 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:09.979174 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-7454559757-dpm6j"]
Apr 21 15:57:11.481355 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:11.481319 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir"
podUID="56a319d7-3883-4871-bc37-8a8b0772785c" path="/var/lib/kubelet/pods/56a319d7-3883-4871-bc37-8a8b0772785c/volumes" Apr 21 15:57:22.758678 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.758653 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-985dcd4b5-xldg8_703b5a30-0d32-4af1-a6ae-7c5466d465ad/main/0.log" Apr 21 15:57:22.759070 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.759033 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:57:22.877105 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.877076 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kserve-provision-location\") pod \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " Apr 21 15:57:22.877273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.877138 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/703b5a30-0d32-4af1-a6ae-7c5466d465ad-tls-certs\") pod \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " Apr 21 15:57:22.877273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.877154 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-home\") pod \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " Apr 21 15:57:22.877273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.877178 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmdrm\" (UniqueName: 
\"kubernetes.io/projected/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kube-api-access-tmdrm\") pod \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " Apr 21 15:57:22.877273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.877209 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-dshm\") pod \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " Apr 21 15:57:22.877273 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.877233 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-model-cache\") pod \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\" (UID: \"703b5a30-0d32-4af1-a6ae-7c5466d465ad\") " Apr 21 15:57:22.877576 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.877532 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-model-cache" (OuterVolumeSpecName: "model-cache") pod "703b5a30-0d32-4af1-a6ae-7c5466d465ad" (UID: "703b5a30-0d32-4af1-a6ae-7c5466d465ad"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:57:22.877644 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.877578 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-home" (OuterVolumeSpecName: "home") pod "703b5a30-0d32-4af1-a6ae-7c5466d465ad" (UID: "703b5a30-0d32-4af1-a6ae-7c5466d465ad"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:57:22.879403 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.879376 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/703b5a30-0d32-4af1-a6ae-7c5466d465ad-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "703b5a30-0d32-4af1-a6ae-7c5466d465ad" (UID: "703b5a30-0d32-4af1-a6ae-7c5466d465ad"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:57:22.879502 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.879441 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kube-api-access-tmdrm" (OuterVolumeSpecName: "kube-api-access-tmdrm") pod "703b5a30-0d32-4af1-a6ae-7c5466d465ad" (UID: "703b5a30-0d32-4af1-a6ae-7c5466d465ad"). InnerVolumeSpecName "kube-api-access-tmdrm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:57:22.879502 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.879458 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-dshm" (OuterVolumeSpecName: "dshm") pod "703b5a30-0d32-4af1-a6ae-7c5466d465ad" (UID: "703b5a30-0d32-4af1-a6ae-7c5466d465ad"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:57:22.941290 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.941249 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "703b5a30-0d32-4af1-a6ae-7c5466d465ad" (UID: "703b5a30-0d32-4af1-a6ae-7c5466d465ad"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:57:22.978170 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.978141 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/703b5a30-0d32-4af1-a6ae-7c5466d465ad-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:57:22.978170 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.978168 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:57:22.978330 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.978179 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmdrm\" (UniqueName: \"kubernetes.io/projected/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kube-api-access-tmdrm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:57:22.978330 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.978189 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:57:22.978330 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.978198 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:57:22.978330 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:22.978208 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/703b5a30-0d32-4af1-a6ae-7c5466d465ad-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:57:23.001903 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.001880 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-985dcd4b5-xldg8_703b5a30-0d32-4af1-a6ae-7c5466d465ad/main/0.log" Apr 21 15:57:23.002257 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.002227 2573 generic.go:358] "Generic (PLEG): container finished" podID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerID="54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be" exitCode=137 Apr 21 15:57:23.002364 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.002310 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" Apr 21 15:57:23.002364 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.002312 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" event={"ID":"703b5a30-0d32-4af1-a6ae-7c5466d465ad","Type":"ContainerDied","Data":"54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be"} Apr 21 15:57:23.002364 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.002356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8" event={"ID":"703b5a30-0d32-4af1-a6ae-7c5466d465ad","Type":"ContainerDied","Data":"c446ea6a25e81e85fa0c7c4b8814d1946f1bc066bafce3efe67ba12ca6eea8e6"} Apr 21 15:57:23.002516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.002372 2573 scope.go:117] "RemoveContainer" containerID="54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be" Apr 21 15:57:23.021584 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.021567 2573 scope.go:117] "RemoveContainer" containerID="0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253" Apr 21 15:57:23.027169 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.027118 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8"] Apr 21 15:57:23.030579 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.030560 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-985dcd4b5-xldg8"] Apr 21 15:57:23.032208 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.032186 2573 scope.go:117] "RemoveContainer" containerID="54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be" Apr 21 15:57:23.032448 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:57:23.032428 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be\": container with ID starting with 54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be not found: ID does not exist" containerID="54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be" Apr 21 15:57:23.032500 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.032457 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be"} err="failed to get container status \"54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be\": rpc error: code = NotFound desc = could not find container \"54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be\": container with ID starting with 54264cd3ef9d96e03e8f47b418b032dee3d67a41e60e01e40175216e76c1e1be not found: ID does not exist" Apr 21 15:57:23.032500 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.032481 2573 scope.go:117] "RemoveContainer" containerID="0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253" Apr 21 15:57:23.032715 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:57:23.032697 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253\": container with ID starting with 0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253 not found: ID does not exist" containerID="0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253" Apr 21 15:57:23.032751 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.032721 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253"} err="failed to get container status \"0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253\": rpc error: code = NotFound desc = could not find container \"0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253\": container with ID starting with 0fb2e89f170ca69cc245270204a95b6365ef662cc0c7970c3103b816b08e1253 not found: ID does not exist" Apr 21 15:57:23.485876 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:23.482135 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" path="/var/lib/kubelet/pods/703b5a30-0d32-4af1-a6ae-7c5466d465ad/volumes" Apr 21 15:57:29.960082 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:29.960005 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" Apr 21 15:57:44.502605 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.502571 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684"] Apr 21 15:57:44.502980 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.502934 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="storage-initializer" Apr 21 15:57:44.502980 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.502952 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="storage-initializer" Apr 21 15:57:44.503081 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.502986 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" Apr 21 15:57:44.503081 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.502995 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" Apr 21 15:57:44.503081 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.503008 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="56a319d7-3883-4871-bc37-8a8b0772785c" containerName="manager" Apr 21 15:57:44.503081 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.503016 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a319d7-3883-4871-bc37-8a8b0772785c" containerName="manager" Apr 21 15:57:44.503259 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.503086 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="56a319d7-3883-4871-bc37-8a8b0772785c" containerName="manager" Apr 21 15:57:44.503259 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.503095 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="703b5a30-0d32-4af1-a6ae-7c5466d465ad" containerName="main" Apr 21 15:57:44.505399 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.505376 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.508193 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.508164 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-cl78r\"" Apr 21 15:57:44.520032 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.519983 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684"] Apr 21 15:57:44.662127 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.662087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.662127 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.662131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.662366 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.662236 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 
21 15:57:44.662366 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.662278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzcmw\" (UniqueName: \"kubernetes.io/projected/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-kube-api-access-bzcmw\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.662366 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.662308 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.662366 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.662340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.662366 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.662367 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.662600 ip-10-0-136-123 kubenswrapper[2573]: I0421 
15:57:44.662390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.662600 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.662423 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.763502 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763414 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.763502 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzcmw\" (UniqueName: \"kubernetes.io/projected/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-kube-api-access-bzcmw\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.763502 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.763722 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763514 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.763722 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.763841 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.763841 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763762 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-workload-socket\") pod 
\"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.763841 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.764010 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.764010 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.764010 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.763972 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.764162 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.764111 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.764257 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.764234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.764498 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.764479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.766015 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.765997 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.766583 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.766563 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.772845 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.772821 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzcmw\" (UniqueName: \"kubernetes.io/projected/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-kube-api-access-bzcmw\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.773259 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.773237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-nm684\" (UID: \"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.819351 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.819315 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:44.957536 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.957494 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684"] Apr 21 15:57:44.960191 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:57:44.960162 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c9ec2a_89a5_4cf1_a9f1_0e414968c8cf.slice/crio-aece8aa59735cb4a808ea4367421e1469342ca558801f936be81d3c041286503 WatchSource:0}: Error finding container aece8aa59735cb4a808ea4367421e1469342ca558801f936be81d3c041286503: Status 404 returned error can't find the container with id aece8aa59735cb4a808ea4367421e1469342ca558801f936be81d3c041286503 Apr 21 15:57:44.962974 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.962936 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:57:44.963089 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.963018 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:57:44.963089 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:44.963055 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 21 15:57:45.078483 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:45.078442 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" 
event={"ID":"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf","Type":"ContainerStarted","Data":"1fc833e395f6b5ab6bb95e96adabf99a198f9b54fe9d10d86f7264d91a908218"} Apr 21 15:57:45.078483 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:45.078485 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" event={"ID":"93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf","Type":"ContainerStarted","Data":"aece8aa59735cb4a808ea4367421e1469342ca558801f936be81d3c041286503"} Apr 21 15:57:45.820235 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:45.820196 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:47.820140 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:47.820091 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" podUID="93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.41:15021/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 21 15:57:48.821421 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:48.821373 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" podUID="93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.41:15021/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 21 15:57:48.872576 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:48.872541 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:48.873038 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:48.873013 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:48.880330 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:48.880303 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" Apr 21 15:57:48.894736 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:57:48.894695 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-nm684" podStartSLOduration=4.8946832879999995 podStartE2EDuration="4.894683288s" podCreationTimestamp="2026-04-21 15:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:57:45.102355263 +0000 UTC m=+1360.232408899" watchObservedRunningTime="2026-04-21 15:57:48.894683288 +0000 UTC m=+1364.024736922" Apr 21 15:58:06.284021 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.283988 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c"] Apr 21 15:58:06.286567 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.286551 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.289381 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.289357 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 21 15:58:06.299195 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.299167 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c"] Apr 21 15:58:06.355929 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.355896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-home\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.356102 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.355936 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.356102 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.355981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 
15:58:06.356102 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.356015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks9pc\" (UniqueName: \"kubernetes.io/projected/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kube-api-access-ks9pc\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.356102 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.356084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.356248 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.356120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.457569 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.457531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks9pc\" (UniqueName: \"kubernetes.io/projected/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kube-api-access-ks9pc\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 
15:58:06.457569 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.457570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.457836 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.457591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.457836 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.457630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-home\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.457836 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.457657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.458462 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.458420 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.458599 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.458477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.458599 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.458481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.458599 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.458560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-home\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.464902 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.464868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-dshm\") pod 
\"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.465535 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.465361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.466491 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.466468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks9pc\" (UniqueName: \"kubernetes.io/projected/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kube-api-access-ks9pc\") pod \"router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.598689 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.598652 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:06.733357 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:06.733326 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c"] Apr 21 15:58:06.736206 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:58:06.736163 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa8e534b_9e54_4a71_bb6e_72ab0aaf97ab.slice/crio-2faa3f2971075393c3d6b798b0af99204ab17f829c4df0062b67ccbd3beb3f56 WatchSource:0}: Error finding container 2faa3f2971075393c3d6b798b0af99204ab17f829c4df0062b67ccbd3beb3f56: Status 404 returned error can't find the container with id 2faa3f2971075393c3d6b798b0af99204ab17f829c4df0062b67ccbd3beb3f56 Apr 21 15:58:07.153966 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:07.153928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" event={"ID":"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab","Type":"ContainerStarted","Data":"298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da"} Apr 21 15:58:07.153966 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:07.153966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" event={"ID":"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab","Type":"ContainerStarted","Data":"2faa3f2971075393c3d6b798b0af99204ab17f829c4df0062b67ccbd3beb3f56"} Apr 21 15:58:11.172349 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:11.172263 2573 generic.go:358] "Generic (PLEG): container finished" podID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerID="298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da" exitCode=0 Apr 21 15:58:11.172349 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:11.172334 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" event={"ID":"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab","Type":"ContainerDied","Data":"298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da"} Apr 21 15:58:12.176769 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:12.176730 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" event={"ID":"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab","Type":"ContainerStarted","Data":"455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96"} Apr 21 15:58:12.199293 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:12.199232 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podStartSLOduration=6.199214126 podStartE2EDuration="6.199214126s" podCreationTimestamp="2026-04-21 15:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:58:12.197213817 +0000 UTC m=+1387.327267464" watchObservedRunningTime="2026-04-21 15:58:12.199214126 +0000 UTC m=+1387.329267762" Apr 21 15:58:16.599125 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:16.599074 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:16.599125 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:16.599124 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:58:16.600684 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:16.600648 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" 
podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 21 15:58:26.599727 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:26.599673 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 21 15:58:36.599327 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:36.599275 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 21 15:58:46.599663 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:46.599624 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 21 15:58:56.599841 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:56.599728 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 21 15:58:59.733425 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:59.733388 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"] Apr 21 15:58:59.733809 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:59.733750 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="main" containerID="cri-o://92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc" gracePeriod=30 Apr 21 15:58:59.733889 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:58:59.733828 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="tokenizer" containerID="cri-o://891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc" gracePeriod=30 Apr 21 15:58:59.959455 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:58:59.959417 2573 logging.go:55] [core] [Channel #248 SubChannel #249]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.40:9003", ServerName: "10.132.0.40:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.40:9003: connect: connection refused" Apr 21 15:59:00.348716 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:00.348675 2573 generic.go:358] "Generic (PLEG): container finished" podID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerID="92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc" exitCode=0 Apr 21 15:59:00.348920 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:00.348741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" event={"ID":"071ee785-900b-47c6-9e8f-c65e221cc3ad","Type":"ContainerDied","Data":"92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc"} Apr 21 15:59:00.959282 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:00.959237 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.40:9003\" within 1s: context deadline exceeded" Apr 21 15:59:00.984345 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:00.984324 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" Apr 21 15:59:01.041842 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.041734 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-uds\") pod \"071ee785-900b-47c6-9e8f-c65e221cc3ad\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " Apr 21 15:59:01.041842 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.041804 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-kserve-provision-location\") pod \"071ee785-900b-47c6-9e8f-c65e221cc3ad\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " Apr 21 15:59:01.042076 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.041857 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/071ee785-900b-47c6-9e8f-c65e221cc3ad-tls-certs\") pod \"071ee785-900b-47c6-9e8f-c65e221cc3ad\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " Apr 21 15:59:01.042076 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.041899 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-cache\") pod \"071ee785-900b-47c6-9e8f-c65e221cc3ad\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " Apr 21 15:59:01.042076 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.041940 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv9kl\" (UniqueName: \"kubernetes.io/projected/071ee785-900b-47c6-9e8f-c65e221cc3ad-kube-api-access-fv9kl\") pod \"071ee785-900b-47c6-9e8f-c65e221cc3ad\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " 
Apr 21 15:59:01.042248 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.042079 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "071ee785-900b-47c6-9e8f-c65e221cc3ad" (UID: "071ee785-900b-47c6-9e8f-c65e221cc3ad"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:59:01.042248 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.042141 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-tmp\") pod \"071ee785-900b-47c6-9e8f-c65e221cc3ad\" (UID: \"071ee785-900b-47c6-9e8f-c65e221cc3ad\") " Apr 21 15:59:01.042248 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.042233 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "071ee785-900b-47c6-9e8f-c65e221cc3ad" (UID: "071ee785-900b-47c6-9e8f-c65e221cc3ad"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:59:01.042450 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.042429 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:59:01.042521 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.042457 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-uds\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:59:01.042670 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.042644 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "071ee785-900b-47c6-9e8f-c65e221cc3ad" (UID: "071ee785-900b-47c6-9e8f-c65e221cc3ad"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:59:01.042955 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.042924 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "071ee785-900b-47c6-9e8f-c65e221cc3ad" (UID: "071ee785-900b-47c6-9e8f-c65e221cc3ad"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:59:01.044163 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.044136 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071ee785-900b-47c6-9e8f-c65e221cc3ad-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "071ee785-900b-47c6-9e8f-c65e221cc3ad" (UID: "071ee785-900b-47c6-9e8f-c65e221cc3ad"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:59:01.044245 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.044156 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071ee785-900b-47c6-9e8f-c65e221cc3ad-kube-api-access-fv9kl" (OuterVolumeSpecName: "kube-api-access-fv9kl") pod "071ee785-900b-47c6-9e8f-c65e221cc3ad" (UID: "071ee785-900b-47c6-9e8f-c65e221cc3ad"). InnerVolumeSpecName "kube-api-access-fv9kl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:59:01.143750 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.143705 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/071ee785-900b-47c6-9e8f-c65e221cc3ad-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:59:01.143914 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.143755 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fv9kl\" (UniqueName: \"kubernetes.io/projected/071ee785-900b-47c6-9e8f-c65e221cc3ad-kube-api-access-fv9kl\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:59:01.143914 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.143774 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-tokenizer-tmp\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:59:01.143914 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.143816 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/071ee785-900b-47c6-9e8f-c65e221cc3ad-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 15:59:01.353760 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.353726 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerID="891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc" exitCode=0 Apr 21 15:59:01.353944 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.353825 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" event={"ID":"071ee785-900b-47c6-9e8f-c65e221cc3ad","Type":"ContainerDied","Data":"891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc"} Apr 21 15:59:01.353944 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.353870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" event={"ID":"071ee785-900b-47c6-9e8f-c65e221cc3ad","Type":"ContainerDied","Data":"aa93790b8f3d7ccf01fda798578e64475a412b502d6859d053f482d2f122df80"} Apr 21 15:59:01.353944 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.353887 2573 scope.go:117] "RemoveContainer" containerID="891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc" Apr 21 15:59:01.353944 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.353889 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m" Apr 21 15:59:01.362317 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.362295 2573 scope.go:117] "RemoveContainer" containerID="92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc" Apr 21 15:59:01.369975 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.369958 2573 scope.go:117] "RemoveContainer" containerID="eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5" Apr 21 15:59:01.378550 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.378528 2573 scope.go:117] "RemoveContainer" containerID="891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc" Apr 21 15:59:01.378862 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:59:01.378785 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc\": container with ID starting with 891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc not found: ID does not exist" containerID="891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc" Apr 21 15:59:01.378946 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.378870 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc"} err="failed to get container status \"891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc\": rpc error: code = NotFound desc = could not find container \"891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc\": container with ID starting with 891f0b3b168ff5c2446ed30820064889dcd8fe2ccf25ffc06e9d711067920fbc not found: ID does not exist" Apr 21 15:59:01.378946 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.378891 2573 scope.go:117] "RemoveContainer" containerID="92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc" 
Apr 21 15:59:01.379164 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:59:01.379135 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc\": container with ID starting with 92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc not found: ID does not exist" containerID="92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc" Apr 21 15:59:01.379243 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.379168 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc"} err="failed to get container status \"92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc\": rpc error: code = NotFound desc = could not find container \"92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc\": container with ID starting with 92d0556ebe10cb1168b8dd2131cc38b12e6bcc0f4e36cc163bd2a2abda514cdc not found: ID does not exist" Apr 21 15:59:01.379243 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.379181 2573 scope.go:117] "RemoveContainer" containerID="eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5" Apr 21 15:59:01.379432 ip-10-0-136-123 kubenswrapper[2573]: E0421 15:59:01.379409 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5\": container with ID starting with eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5 not found: ID does not exist" containerID="eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5" Apr 21 15:59:01.379509 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.379438 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5"} err="failed to get container status \"eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5\": rpc error: code = NotFound desc = could not find container \"eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5\": container with ID starting with eddadb309bb46ff01e8897112be22e42d4596c8b2431ff68724cdc270c50a7e5 not found: ID does not exist" Apr 21 15:59:01.379913 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.379893 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"] Apr 21 15:59:01.383179 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.383158 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5458bd7bf-txt8m"] Apr 21 15:59:01.482343 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:01.482313 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" path="/var/lib/kubelet/pods/071ee785-900b-47c6-9e8f-c65e221cc3ad/volumes" Apr 21 15:59:06.599055 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:06.599005 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 21 15:59:16.599289 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:16.599227 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection 
refused" Apr 21 15:59:20.153002 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.152959 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b"] Apr 21 15:59:20.153507 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.153487 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="storage-initializer" Apr 21 15:59:20.153555 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.153513 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="storage-initializer" Apr 21 15:59:20.153555 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.153537 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="main" Apr 21 15:59:20.153555 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.153547 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="main" Apr 21 15:59:20.153652 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.153567 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="tokenizer" Apr 21 15:59:20.153652 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.153577 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="tokenizer" Apr 21 15:59:20.153714 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.153657 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="tokenizer" Apr 21 15:59:20.153714 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.153672 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="071ee785-900b-47c6-9e8f-c65e221cc3ad" containerName="main" Apr 21 
15:59:20.158850 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.158827 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.162121 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.162100 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-kwm2g\"" Apr 21 15:59:20.162512 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.162491 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 21 15:59:20.169516 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.169492 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b"] Apr 21 15:59:20.173298 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.173276 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4"] Apr 21 15:59:20.176409 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.176394 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.192579 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.192548 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4"] Apr 21 15:59:20.307853 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.307792 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.307853 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.307856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.308058 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.307905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.308058 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.307925 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.308058 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.307945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.308058 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.307972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2pb\" (UniqueName: \"kubernetes.io/projected/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kube-api-access-9k2pb\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.308058 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.307990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkhrh\" (UniqueName: \"kubernetes.io/projected/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kube-api-access-vkhrh\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.308223 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.308050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.308223 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.308090 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.308223 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.308117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.308223 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.308136 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.308223 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.308210 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.409350 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.409350 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.409350 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.409645 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409366 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.409645 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409385 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.409645 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.409829 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k2pb\" (UniqueName: \"kubernetes.io/projected/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kube-api-access-9k2pb\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.409829 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkhrh\" 
(UniqueName: \"kubernetes.io/projected/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kube-api-access-vkhrh\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.409948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409845 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.409948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.409948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.409948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409724 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.409948 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.410220 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.409964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.410220 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.410060 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.410220 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.410138 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.410341 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.410296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.410341 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.410296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.412341 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.412315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.412440 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.412316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.412480 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.412462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.412518 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.412497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.423387 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.423345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k2pb\" (UniqueName: \"kubernetes.io/projected/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kube-api-access-9k2pb\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.424234 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.424213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkhrh\" (UniqueName: \"kubernetes.io/projected/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kube-api-access-vkhrh\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b\" (UID: 
\"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.424976 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.424956 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn"] Apr 21 15:59:20.430247 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.430220 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.433515 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.433491 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-stzdm\"" Apr 21 15:59:20.440927 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.440901 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn"] Apr 21 15:59:20.469968 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.469940 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:20.489919 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.489893 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:20.613658 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.613617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.613892 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.613692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.613892 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.613763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.613892 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.613839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-cache\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.613892 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.613867 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f206a71-fd22-4b85-bbae-e38488586fb3-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.614117 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.613894 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wptcq\" (UniqueName: \"kubernetes.io/projected/6f206a71-fd22-4b85-bbae-e38488586fb3-kube-api-access-wptcq\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.616292 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.616264 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b"] Apr 21 15:59:20.618196 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:59:20.618163 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6961ef97_4c61_4ba5_85ed_00ddf50f12e9.slice/crio-42d327fc25524ad4a1a2e59efbf3cf45d1fd39aa81291af32e651f9931d312c4 WatchSource:0}: Error finding container 42d327fc25524ad4a1a2e59efbf3cf45d1fd39aa81291af32e651f9931d312c4: Status 404 returned error can't find the container with id 
42d327fc25524ad4a1a2e59efbf3cf45d1fd39aa81291af32e651f9931d312c4 Apr 21 15:59:20.645920 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.645894 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4"] Apr 21 15:59:20.648702 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:59:20.648666 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88d11bfe_261a_4d92_9f6d_092c5ebea6e7.slice/crio-5363b6c18cab8f3ab2a215790dbb37606937ac0871df7139765565c3ef16442f WatchSource:0}: Error finding container 5363b6c18cab8f3ab2a215790dbb37606937ac0871df7139765565c3ef16442f: Status 404 returned error can't find the container with id 5363b6c18cab8f3ab2a215790dbb37606937ac0871df7139765565c3ef16442f Apr 21 15:59:20.714813 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.714774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.714932 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.714858 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.714932 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.714886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.714932 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.714903 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f206a71-fd22-4b85-bbae-e38488586fb3-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.714932 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.714920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wptcq\" (UniqueName: \"kubernetes.io/projected/6f206a71-fd22-4b85-bbae-e38488586fb3-kube-api-access-wptcq\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.715159 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.714962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.715243 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.715219 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.715303 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.715253 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.715542 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.715514 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.715634 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.715520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.717258 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.717228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6f206a71-fd22-4b85-bbae-e38488586fb3-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.723637 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.723610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wptcq\" (UniqueName: \"kubernetes.io/projected/6f206a71-fd22-4b85-bbae-e38488586fb3-kube-api-access-wptcq\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.746680 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.746654 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:20.887151 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:20.887113 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn"] Apr 21 15:59:20.889772 ip-10-0-136-123 kubenswrapper[2573]: W0421 15:59:20.889743 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f206a71_fd22_4b85_bbae_e38488586fb3.slice/crio-aebc7615361a233921ba07797aceeefd7c531e6e988924a4070ca922419dbe08 WatchSource:0}: Error finding container aebc7615361a233921ba07797aceeefd7c531e6e988924a4070ca922419dbe08: Status 404 returned error can't find the container with id aebc7615361a233921ba07797aceeefd7c531e6e988924a4070ca922419dbe08 Apr 21 15:59:21.435892 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:21.435852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" event={"ID":"88d11bfe-261a-4d92-9f6d-092c5ebea6e7","Type":"ContainerStarted","Data":"ceef73b45dc0a27f28ce71c4e7d72d7de0520aebd2f8da43a7e303799ac03e16"} Apr 21 15:59:21.437741 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:21.437595 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" event={"ID":"88d11bfe-261a-4d92-9f6d-092c5ebea6e7","Type":"ContainerStarted","Data":"5363b6c18cab8f3ab2a215790dbb37606937ac0871df7139765565c3ef16442f"} Apr 21 15:59:21.444531 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:21.444498 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" event={"ID":"6961ef97-4c61-4ba5-85ed-00ddf50f12e9","Type":"ContainerStarted","Data":"42d327fc25524ad4a1a2e59efbf3cf45d1fd39aa81291af32e651f9931d312c4"} Apr 21 15:59:21.450102 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:21.450045 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" event={"ID":"6f206a71-fd22-4b85-bbae-e38488586fb3","Type":"ContainerStarted","Data":"87fef2b240b436bf95104e159baac36efc14326de40414727df517031bb77bdb"} Apr 21 15:59:21.450102 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:21.450090 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" event={"ID":"6f206a71-fd22-4b85-bbae-e38488586fb3","Type":"ContainerStarted","Data":"aebc7615361a233921ba07797aceeefd7c531e6e988924a4070ca922419dbe08"} Apr 21 15:59:22.455614 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:22.455575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" 
event={"ID":"6961ef97-4c61-4ba5-85ed-00ddf50f12e9","Type":"ContainerStarted","Data":"9485dd29a524f2e1702ccb1e112dadd802a9a6344a09244ada677f9d86e00c3c"} Apr 21 15:59:22.456080 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:22.455703 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:22.457099 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:22.457071 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerID="87fef2b240b436bf95104e159baac36efc14326de40414727df517031bb77bdb" exitCode=0 Apr 21 15:59:22.457218 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:22.457140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" event={"ID":"6f206a71-fd22-4b85-bbae-e38488586fb3","Type":"ContainerDied","Data":"87fef2b240b436bf95104e159baac36efc14326de40414727df517031bb77bdb"} Apr 21 15:59:23.464921 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:23.464865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" event={"ID":"6f206a71-fd22-4b85-bbae-e38488586fb3","Type":"ContainerStarted","Data":"15831cc00facb007eb29f78ec5f0da7974ad1c2f5d598a5346d5ac40fca57c4a"} Apr 21 15:59:23.465594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:23.464957 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" event={"ID":"6f206a71-fd22-4b85-bbae-e38488586fb3","Type":"ContainerStarted","Data":"535ef156d789d9db74efe948658c2315011a8bee44203733fa2b02c150b7ee92"} Apr 21 15:59:23.465594 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:23.465015 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:23.467512 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:23.467450 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" event={"ID":"6961ef97-4c61-4ba5-85ed-00ddf50f12e9","Type":"ContainerStarted","Data":"8497fff3e514e70b68c56a45eefcb47553bcf37a64db2e6898530da1afab8c2c"} Apr 21 15:59:23.490480 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:23.490415 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" podStartSLOduration=3.490395708 podStartE2EDuration="3.490395708s" podCreationTimestamp="2026-04-21 15:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:59:23.488073688 +0000 UTC m=+1458.618127324" watchObservedRunningTime="2026-04-21 15:59:23.490395708 +0000 UTC m=+1458.620449343" Apr 21 15:59:26.487298 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:26.487189 2573 generic.go:358] "Generic (PLEG): container finished" podID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerID="ceef73b45dc0a27f28ce71c4e7d72d7de0520aebd2f8da43a7e303799ac03e16" exitCode=0 Apr 21 15:59:26.487298 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:26.487279 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" event={"ID":"88d11bfe-261a-4d92-9f6d-092c5ebea6e7","Type":"ContainerDied","Data":"ceef73b45dc0a27f28ce71c4e7d72d7de0520aebd2f8da43a7e303799ac03e16"} Apr 21 15:59:26.599440 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:26.599383 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" 
podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 21 15:59:27.492539 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:27.492486 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" event={"ID":"88d11bfe-261a-4d92-9f6d-092c5ebea6e7","Type":"ContainerStarted","Data":"0b791d899605216f97340877008ac961d76e17004984e905f8a5214a3a7866bd"} Apr 21 15:59:27.494289 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:27.494260 2573 generic.go:358] "Generic (PLEG): container finished" podID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerID="8497fff3e514e70b68c56a45eefcb47553bcf37a64db2e6898530da1afab8c2c" exitCode=0 Apr 21 15:59:27.494436 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:27.494320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" event={"ID":"6961ef97-4c61-4ba5-85ed-00ddf50f12e9","Type":"ContainerDied","Data":"8497fff3e514e70b68c56a45eefcb47553bcf37a64db2e6898530da1afab8c2c"} Apr 21 15:59:27.517476 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:27.517420 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podStartSLOduration=7.517402078 podStartE2EDuration="7.517402078s" podCreationTimestamp="2026-04-21 15:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:59:27.51495071 +0000 UTC m=+1462.645004345" watchObservedRunningTime="2026-04-21 15:59:27.517402078 +0000 UTC m=+1462.647455713" Apr 21 15:59:28.500602 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:28.500556 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" event={"ID":"6961ef97-4c61-4ba5-85ed-00ddf50f12e9","Type":"ContainerStarted","Data":"d02937932f4340f31d95a430e5137692b510cd6d381683c1873ce81aa6192190"} Apr 21 15:59:28.528312 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:28.528244 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podStartSLOduration=7.282369829 podStartE2EDuration="8.528223298s" podCreationTimestamp="2026-04-21 15:59:20 +0000 UTC" firstStartedPulling="2026-04-21 15:59:20.620412296 +0000 UTC m=+1455.750465924" lastFinishedPulling="2026-04-21 15:59:21.866265781 +0000 UTC m=+1456.996319393" observedRunningTime="2026-04-21 15:59:28.524315416 +0000 UTC m=+1463.654369052" watchObservedRunningTime="2026-04-21 15:59:28.528223298 +0000 UTC m=+1463.658276932" Apr 21 15:59:30.470966 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.470911 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:30.470966 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.470966 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:30.472606 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.472571 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused" Apr 21 15:59:30.490270 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.490237 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:30.490420 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.490286 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 15:59:30.491773 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.491737 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 21 15:59:30.747945 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.747845 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:30.747945 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.747893 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:30.749855 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:30.749513 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.45:8082/healthz\": dial tcp 10.132.0.45:8082: connect: connection refused" Apr 21 15:59:36.599269 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:36.599210 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 21 15:59:40.470712 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:40.470661 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused" Apr 21 15:59:40.488974 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:40.488940 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 15:59:40.490900 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:40.490866 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 21 15:59:40.749446 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:40.749347 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:40.750862 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:40.750834 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 15:59:46.608819 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:46.608770 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:59:46.616558 ip-10-0-136-123 kubenswrapper[2573]: 
I0421 15:59:46.616526 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 15:59:50.470496 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:50.470453 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused" Apr 21 15:59:50.491351 ip-10-0-136-123 kubenswrapper[2573]: I0421 15:59:50.491307 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 21 16:00:00.471148 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:00.471097 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused" Apr 21 16:00:00.490723 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:00.490661 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 21 16:00:01.554503 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:01.554468 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" Apr 21 16:00:05.496302 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:05.496272 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 16:00:05.497544 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:05.497515 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 16:00:08.829067 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:08.829030 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c"] Apr 21 16:00:08.829999 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:08.829960 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main" containerID="cri-o://455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96" gracePeriod=30 Apr 21 16:00:10.470737 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:10.470696 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused" Apr 21 16:00:10.490445 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:10.490408 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 21 16:00:20.471411 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:20.471355 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused" Apr 21 16:00:20.491141 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:20.491104 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 21 16:00:30.470976 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:30.470875 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused" Apr 21 16:00:30.491673 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:30.491639 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 21 16:00:39.175191 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.175162 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" Apr 21 16:00:39.247785 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.247740 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-dshm\") pod \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " Apr 21 16:00:39.247950 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.247851 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kserve-provision-location\") pod \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " Apr 21 16:00:39.247950 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.247920 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks9pc\" (UniqueName: \"kubernetes.io/projected/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kube-api-access-ks9pc\") pod \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " Apr 21 16:00:39.248024 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.247958 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-home\") pod \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " Apr 21 16:00:39.248024 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.248017 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-model-cache\") pod \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " Apr 21 16:00:39.248130 
ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.248058 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-tls-certs\") pod \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\" (UID: \"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab\") " Apr 21 16:00:39.248286 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.248250 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-model-cache" (OuterVolumeSpecName: "model-cache") pod "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" (UID: "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:39.248422 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.248356 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:00:39.248422 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.248365 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-home" (OuterVolumeSpecName: "home") pod "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" (UID: "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:00:39.250245 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.250222 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kube-api-access-ks9pc" (OuterVolumeSpecName: "kube-api-access-ks9pc") pod "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" (UID: "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab"). InnerVolumeSpecName "kube-api-access-ks9pc". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 16:00:39.250462 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.250425 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" (UID: "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 16:00:39.250462 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.250438 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-dshm" (OuterVolumeSpecName: "dshm") pod "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" (UID: "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:00:39.312309 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.312274 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" (UID: "fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:00:39.349367 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.349309 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:00:39.349367 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.349335 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:00:39.349367 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.349347 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks9pc\" (UniqueName: \"kubernetes.io/projected/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-kube-api-access-ks9pc\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:00:39.349367 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.349355 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:00:39.349367 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.349364 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:00:39.781372 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.781286 2573 generic.go:358] "Generic (PLEG): container finished" podID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerID="455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96" exitCode=137
Apr 21 16:00:39.781372 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.781346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" event={"ID":"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab","Type":"ContainerDied","Data":"455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96"}
Apr 21 16:00:39.781372 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.781366 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c"
Apr 21 16:00:39.781614 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.781380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c" event={"ID":"fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab","Type":"ContainerDied","Data":"2faa3f2971075393c3d6b798b0af99204ab17f829c4df0062b67ccbd3beb3f56"}
Apr 21 16:00:39.781614 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.781400 2573 scope.go:117] "RemoveContainer" containerID="455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96"
Apr 21 16:00:39.800682 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.800602 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c"]
Apr 21 16:00:39.800682 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.800648 2573 scope.go:117] "RemoveContainer" containerID="298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da"
Apr 21 16:00:39.804284 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.804263 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5449f976b6-ktl2c"]
Apr 21 16:00:39.874475 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.874302 2573 scope.go:117] "RemoveContainer" containerID="455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96"
Apr 21 16:00:39.874713 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:00:39.874690 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96\": container with ID starting with 455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96 not found: ID does not exist" containerID="455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96"
Apr 21 16:00:39.874841 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.874726 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96"} err="failed to get container status \"455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96\": rpc error: code = NotFound desc = could not find container \"455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96\": container with ID starting with 455710a0f9449e275c69495b5acfb690f7582aa7e96516958ad92faacf270c96 not found: ID does not exist"
Apr 21 16:00:39.874841 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.874756 2573 scope.go:117] "RemoveContainer" containerID="298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da"
Apr 21 16:00:39.875080 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:00:39.875061 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da\": container with ID starting with 298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da not found: ID does not exist" containerID="298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da"
Apr 21 16:00:39.875122 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:39.875087 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da"} err="failed to get container status \"298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da\": rpc error: code = NotFound desc = could not find container \"298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da\": container with ID starting with 298f11c62f999a955be57859b470d55f379d91b6a22cafb0cbabf69932e679da not found: ID does not exist"
Apr 21 16:00:40.470630 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:40.470580 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused"
Apr 21 16:00:40.491096 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:40.491066 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused"
Apr 21 16:00:41.482146 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:41.482112 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" path="/var/lib/kubelet/pods/fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab/volumes"
Apr 21 16:00:50.470481 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:50.470438 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused"
Apr 21 16:00:50.490906 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:00:50.490863 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused"
Apr 21 16:01:00.470879 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:00.470820 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused"
Apr 21 16:01:00.491588 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:00.491548 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused"
Apr 21 16:01:10.470564 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:10.470498 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused"
Apr 21 16:01:10.490904 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:10.490863 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused"
Apr 21 16:01:20.471032 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:20.470986 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused"
Apr 21 16:01:20.491067 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:20.491027 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused"
Apr 21 16:01:30.470698 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:30.470640 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8001/health\": dial tcp 10.132.0.43:8001: connect: connection refused"
Apr 21 16:01:30.490948 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:30.490913 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused"
Apr 21 16:01:40.480282 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:40.480250 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b"
Apr 21 16:01:40.493170 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:40.493148 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b"
Apr 21 16:01:40.500221 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:40.500200 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4"
Apr 21 16:01:40.509094 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:40.509071 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4"
Apr 21 16:01:52.946853 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:52.946816 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4"]
Apr 21 16:01:52.947374 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:52.947205 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" containerID="cri-o://0b791d899605216f97340877008ac961d76e17004984e905f8a5214a3a7866bd" gracePeriod=30
Apr 21 16:01:52.951664 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:52.951639 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn"]
Apr 21 16:01:52.951951 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:52.951925 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="main" containerID="cri-o://535ef156d789d9db74efe948658c2315011a8bee44203733fa2b02c150b7ee92" gracePeriod=30
Apr 21 16:01:52.952032 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:52.951925 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="tokenizer" containerID="cri-o://15831cc00facb007eb29f78ec5f0da7974ad1c2f5d598a5346d5ac40fca57c4a" gracePeriod=30
Apr 21 16:01:52.962529 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:52.962501 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b"]
Apr 21 16:01:52.962838 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:52.962809 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" containerID="cri-o://d02937932f4340f31d95a430e5137692b510cd6d381683c1873ce81aa6192190" gracePeriod=30
Apr 21 16:01:54.058972 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.058938 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerID="15831cc00facb007eb29f78ec5f0da7974ad1c2f5d598a5346d5ac40fca57c4a" exitCode=0
Apr 21 16:01:54.058972 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.058965 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerID="535ef156d789d9db74efe948658c2315011a8bee44203733fa2b02c150b7ee92" exitCode=0
Apr 21 16:01:54.059452 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.058992 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" event={"ID":"6f206a71-fd22-4b85-bbae-e38488586fb3","Type":"ContainerDied","Data":"15831cc00facb007eb29f78ec5f0da7974ad1c2f5d598a5346d5ac40fca57c4a"}
Apr 21 16:01:54.059452 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.059028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" event={"ID":"6f206a71-fd22-4b85-bbae-e38488586fb3","Type":"ContainerDied","Data":"535ef156d789d9db74efe948658c2315011a8bee44203733fa2b02c150b7ee92"}
Apr 21 16:01:54.108704 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.108682 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn"
Apr 21 16:01:54.190197 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190137 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-uds\") pod \"6f206a71-fd22-4b85-bbae-e38488586fb3\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") "
Apr 21 16:01:54.190308 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190202 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wptcq\" (UniqueName: \"kubernetes.io/projected/6f206a71-fd22-4b85-bbae-e38488586fb3-kube-api-access-wptcq\") pod \"6f206a71-fd22-4b85-bbae-e38488586fb3\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") "
Apr 21 16:01:54.190308 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190231 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-kserve-provision-location\") pod \"6f206a71-fd22-4b85-bbae-e38488586fb3\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") "
Apr 21 16:01:54.190308 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190255 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f206a71-fd22-4b85-bbae-e38488586fb3-tls-certs\") pod \"6f206a71-fd22-4b85-bbae-e38488586fb3\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") "
Apr 21 16:01:54.190469 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190305 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-cache\") pod \"6f206a71-fd22-4b85-bbae-e38488586fb3\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") "
Apr 21 16:01:54.190469 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190337 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-tmp\") pod \"6f206a71-fd22-4b85-bbae-e38488586fb3\" (UID: \"6f206a71-fd22-4b85-bbae-e38488586fb3\") "
Apr 21 16:01:54.190469 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190360 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6f206a71-fd22-4b85-bbae-e38488586fb3" (UID: "6f206a71-fd22-4b85-bbae-e38488586fb3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:01:54.190659 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190567 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6f206a71-fd22-4b85-bbae-e38488586fb3" (UID: "6f206a71-fd22-4b85-bbae-e38488586fb3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:01:54.190659 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190598 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-uds\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:01:54.190734 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190695 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6f206a71-fd22-4b85-bbae-e38488586fb3" (UID: "6f206a71-fd22-4b85-bbae-e38488586fb3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:01:54.190909 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.190891 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f206a71-fd22-4b85-bbae-e38488586fb3" (UID: "6f206a71-fd22-4b85-bbae-e38488586fb3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:01:54.192369 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.192347 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f206a71-fd22-4b85-bbae-e38488586fb3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6f206a71-fd22-4b85-bbae-e38488586fb3" (UID: "6f206a71-fd22-4b85-bbae-e38488586fb3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 16:01:54.192422 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.192386 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f206a71-fd22-4b85-bbae-e38488586fb3-kube-api-access-wptcq" (OuterVolumeSpecName: "kube-api-access-wptcq") pod "6f206a71-fd22-4b85-bbae-e38488586fb3" (UID: "6f206a71-fd22-4b85-bbae-e38488586fb3"). InnerVolumeSpecName "kube-api-access-wptcq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 16:01:54.291045 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.291023 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:01:54.291045 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.291045 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-tokenizer-tmp\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:01:54.291177 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.291055 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wptcq\" (UniqueName: \"kubernetes.io/projected/6f206a71-fd22-4b85-bbae-e38488586fb3-kube-api-access-wptcq\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:01:54.291177 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.291064 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f206a71-fd22-4b85-bbae-e38488586fb3-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:01:54.291177 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:54.291074 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f206a71-fd22-4b85-bbae-e38488586fb3-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:01:55.064306 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:55.064243 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn"
Apr 21 16:01:55.064306 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:55.064251 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn" event={"ID":"6f206a71-fd22-4b85-bbae-e38488586fb3","Type":"ContainerDied","Data":"aebc7615361a233921ba07797aceeefd7c531e6e988924a4070ca922419dbe08"}
Apr 21 16:01:55.064306 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:55.064295 2573 scope.go:117] "RemoveContainer" containerID="15831cc00facb007eb29f78ec5f0da7974ad1c2f5d598a5346d5ac40fca57c4a"
Apr 21 16:01:55.072593 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:55.072575 2573 scope.go:117] "RemoveContainer" containerID="535ef156d789d9db74efe948658c2315011a8bee44203733fa2b02c150b7ee92"
Apr 21 16:01:55.082725 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:55.082706 2573 scope.go:117] "RemoveContainer" containerID="87fef2b240b436bf95104e159baac36efc14326de40414727df517031bb77bdb"
Apr 21 16:01:55.087745 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:55.087718 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn"]
Apr 21 16:01:55.097274 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:55.093227 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche5tmhn"]
Apr 21 16:01:55.481414 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:01:55.481339 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" path="/var/lib/kubelet/pods/6f206a71-fd22-4b85-bbae-e38488586fb3/volumes"
Apr 21 16:02:03.409986 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.409950 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"]
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410298 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410310 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410321 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="storage-initializer"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410327 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="storage-initializer"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410335 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="main"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410340 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="main"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410355 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="storage-initializer"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410360 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="storage-initializer"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410370 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="tokenizer"
Apr 21 16:02:03.410415 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410375 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="tokenizer"
Apr 21 16:02:03.410723 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410426 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa8e534b-9e54-4a71-bb6e-72ab0aaf97ab" containerName="main"
Apr 21 16:02:03.410723 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410435 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="main"
Apr 21 16:02:03.410723 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.410444 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f206a71-fd22-4b85-bbae-e38488586fb3" containerName="tokenizer"
Apr 21 16:02:03.415347 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.415319 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.418392 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.418355 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 21 16:02:03.431741 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.431715 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"]
Apr 21 16:02:03.456177 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.456152 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpwmz\" (UniqueName: \"kubernetes.io/projected/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kube-api-access-bpwmz\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.456319 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.456188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.456397 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.456309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.456397 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.456351 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.456480 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.456428 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.456480 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.456471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e042e45b-8908-48d8-8c84-1cdda8ab66bb-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.557763 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.557730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.557921 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.557773 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.557921 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.557835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.557921 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.557863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e042e45b-8908-48d8-8c84-1cdda8ab66bb-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.558083 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.557926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpwmz\" (UniqueName: \"kubernetes.io/projected/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kube-api-access-bpwmz\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.558083 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.557968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.558230 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.558206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.558295 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.558242 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.558295 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.558274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"
Apr 21 16:02:03.560147 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.560120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:02:03.560578 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.560554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e042e45b-8908-48d8-8c84-1cdda8ab66bb-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:02:03.568466 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.568441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpwmz\" (UniqueName: \"kubernetes.io/projected/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kube-api-access-bpwmz\") pod \"custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:02:03.726388 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.726318 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:02:03.859821 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.859775 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"] Apr 21 16:02:03.862063 ip-10-0-136-123 kubenswrapper[2573]: W0421 16:02:03.862032 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode042e45b_8908_48d8_8c84_1cdda8ab66bb.slice/crio-6092fd9e2686131c5007feb5cc191cdc143c44fcefd7e2e68d10cee2d18ec17c WatchSource:0}: Error finding container 6092fd9e2686131c5007feb5cc191cdc143c44fcefd7e2e68d10cee2d18ec17c: Status 404 returned error can't find the container with id 6092fd9e2686131c5007feb5cc191cdc143c44fcefd7e2e68d10cee2d18ec17c Apr 21 16:02:03.863836 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:03.863818 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:02:04.099743 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:04.099712 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" event={"ID":"e042e45b-8908-48d8-8c84-1cdda8ab66bb","Type":"ContainerStarted","Data":"871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781"} Apr 21 16:02:04.099743 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:04.099748 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" event={"ID":"e042e45b-8908-48d8-8c84-1cdda8ab66bb","Type":"ContainerStarted","Data":"6092fd9e2686131c5007feb5cc191cdc143c44fcefd7e2e68d10cee2d18ec17c"} Apr 21 16:02:08.116375 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:08.116303 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerID="871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781" exitCode=0 Apr 21 16:02:08.116690 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:08.116385 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" event={"ID":"e042e45b-8908-48d8-8c84-1cdda8ab66bb","Type":"ContainerDied","Data":"871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781"} Apr 21 16:02:09.122079 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:09.122042 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" event={"ID":"e042e45b-8908-48d8-8c84-1cdda8ab66bb","Type":"ContainerStarted","Data":"69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8"} Apr 21 16:02:09.146363 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:09.146304 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podStartSLOduration=6.146285216 podStartE2EDuration="6.146285216s" podCreationTimestamp="2026-04-21 16:02:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:09.14383832 +0000 UTC m=+1624.273891963" watchObservedRunningTime="2026-04-21 16:02:09.146285216 +0000 UTC m=+1624.276338854" Apr 21 16:02:13.084813 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.084753 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 21 16:02:13.089690 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.089653 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.092520 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.092491 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 21 16:02:13.092678 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.092533 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-pclvk\"" Apr 21 16:02:13.096728 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.096703 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 21 16:02:13.147922 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.147888 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.148154 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.148136 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.148285 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.148271 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.148422 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.148406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.148553 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.148538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34799a69-4511-4755-9aee-d09129de977c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.148688 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.148674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57vkv\" (UniqueName: \"kubernetes.io/projected/34799a69-4511-4755-9aee-d09129de977c-kube-api-access-57vkv\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.173919 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.173886 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn"] Apr 21 
16:02:13.176714 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.176694 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.179624 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.179603 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-vl4c7\"" Apr 21 16:02:13.189538 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.189515 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn"] Apr 21 16:02:13.249337 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.249459 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.249459 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzl6h\" (UniqueName: \"kubernetes.io/projected/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kube-api-access-lzl6h\") pod 
\"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.249459 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249395 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.249459 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.249459 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249444 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.249687 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249551 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-model-cache\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.249687 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.249687 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.249687 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249664 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34799a69-4511-4755-9aee-d09129de977c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.249687 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.249964 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.249964 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.249964 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57vkv\" (UniqueName: \"kubernetes.io/projected/34799a69-4511-4755-9aee-d09129de977c-kube-api-access-57vkv\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.249964 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.249908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.251637 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.251614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.252187 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.252165 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34799a69-4511-4755-9aee-d09129de977c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.258835 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.258789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57vkv\" (UniqueName: \"kubernetes.io/projected/34799a69-4511-4755-9aee-d09129de977c-kube-api-access-57vkv\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.350438 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.350373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.350438 ip-10-0-136-123 kubenswrapper[2573]: I0421 
16:02:13.350410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzl6h\" (UniqueName: \"kubernetes.io/projected/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kube-api-access-lzl6h\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.350622 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.350447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.350622 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.350478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.350622 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.350560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.350817 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.350619 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.350920 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.350837 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.350964 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.350931 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.351002 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.350985 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.351239 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.351213 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.353193 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.353170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.360703 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.360684 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzl6h\" (UniqueName: \"kubernetes.io/projected/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kube-api-access-lzl6h\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.402582 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.402550 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:02:13.489331 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.489298 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:13.532621 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.532590 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 21 16:02:13.533970 ip-10-0-136-123 kubenswrapper[2573]: W0421 16:02:13.533917 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34799a69_4511_4755_9aee_d09129de977c.slice/crio-465d9fcff044c8d1a433b64725ebc60eadb3c7208e1f15001f5cc1cbe059ea5e WatchSource:0}: Error finding container 465d9fcff044c8d1a433b64725ebc60eadb3c7208e1f15001f5cc1cbe059ea5e: Status 404 returned error can't find the container with id 465d9fcff044c8d1a433b64725ebc60eadb3c7208e1f15001f5cc1cbe059ea5e Apr 21 16:02:13.632491 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.632461 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn"] Apr 21 16:02:13.635142 ip-10-0-136-123 kubenswrapper[2573]: W0421 16:02:13.635111 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf086989a_62a5_4fa3_a5ad_2ff37aec85bc.slice/crio-da26bc1adaf54629f58fb051f765e468398fca6e0153e9e1af60e9e05aafe259 WatchSource:0}: Error finding container da26bc1adaf54629f58fb051f765e468398fca6e0153e9e1af60e9e05aafe259: Status 404 returned error can't find the container with id da26bc1adaf54629f58fb051f765e468398fca6e0153e9e1af60e9e05aafe259 Apr 21 16:02:13.726768 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.726738 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:02:13.726936 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.726775 
2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:02:13.728566 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:13.728535 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:02:14.142138 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:14.142101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" event={"ID":"f086989a-62a5-4fa3-a5ad-2ff37aec85bc","Type":"ContainerStarted","Data":"12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b"} Apr 21 16:02:14.142138 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:14.142140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" event={"ID":"f086989a-62a5-4fa3-a5ad-2ff37aec85bc","Type":"ContainerStarted","Data":"da26bc1adaf54629f58fb051f765e468398fca6e0153e9e1af60e9e05aafe259"} Apr 21 16:02:14.143671 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:14.143636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"34799a69-4511-4755-9aee-d09129de977c","Type":"ContainerStarted","Data":"2ed10b75a645c7c254b2b5f03595163ed6aec007703deb9189dd99ce1e342de4"} Apr 21 16:02:14.143836 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:14.143677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"34799a69-4511-4755-9aee-d09129de977c","Type":"ContainerStarted","Data":"465d9fcff044c8d1a433b64725ebc60eadb3c7208e1f15001f5cc1cbe059ea5e"} Apr 21 16:02:15.148870 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:15.148830 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" event={"ID":"f086989a-62a5-4fa3-a5ad-2ff37aec85bc","Type":"ContainerDied","Data":"12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b"} Apr 21 16:02:15.148870 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:15.148760 2573 generic.go:358] "Generic (PLEG): container finished" podID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerID="12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b" exitCode=0 Apr 21 16:02:16.156343 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:16.156307 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" event={"ID":"f086989a-62a5-4fa3-a5ad-2ff37aec85bc","Type":"ContainerStarted","Data":"9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3"} Apr 21 16:02:16.156343 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:16.156349 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" event={"ID":"f086989a-62a5-4fa3-a5ad-2ff37aec85bc","Type":"ContainerStarted","Data":"32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d"} Apr 21 16:02:16.156863 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:16.156844 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:16.184199 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:16.184098 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" podStartSLOduration=3.184072794 podStartE2EDuration="3.184072794s" podCreationTimestamp="2026-04-21 16:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:16.179869793 +0000 UTC m=+1631.309923442" watchObservedRunningTime="2026-04-21 16:02:16.184072794 +0000 UTC m=+1631.314126433" Apr 21 16:02:18.167600 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:18.167559 2573 generic.go:358] "Generic (PLEG): container finished" podID="34799a69-4511-4755-9aee-d09129de977c" containerID="2ed10b75a645c7c254b2b5f03595163ed6aec007703deb9189dd99ce1e342de4" exitCode=0 Apr 21 16:02:18.167984 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:18.167602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"34799a69-4511-4755-9aee-d09129de977c","Type":"ContainerDied","Data":"2ed10b75a645c7c254b2b5f03595163ed6aec007703deb9189dd99ce1e342de4"} Apr 21 16:02:19.173929 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:19.173889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"34799a69-4511-4755-9aee-d09129de977c","Type":"ContainerStarted","Data":"2cf8e9c5f71acf6df006f272efdfdb1fc9828c2ecd40ede94eb110757bc4e58a"} Apr 21 16:02:19.197986 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:19.197926 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.197908037 podStartE2EDuration="6.197908037s" podCreationTimestamp="2026-04-21 16:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:19.195285706 
+0000 UTC m=+1634.325339342" watchObservedRunningTime="2026-04-21 16:02:19.197908037 +0000 UTC m=+1634.327961672" Apr 21 16:02:22.963449 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:22.963408 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="llm-d-routing-sidecar" containerID="cri-o://9485dd29a524f2e1702ccb1e112dadd802a9a6344a09244ada677f9d86e00c3c" gracePeriod=2 Apr 21 16:02:23.206450 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.206238 2573 generic.go:358] "Generic (PLEG): container finished" podID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerID="0b791d899605216f97340877008ac961d76e17004984e905f8a5214a3a7866bd" exitCode=137 Apr 21 16:02:23.206450 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.206349 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" event={"ID":"88d11bfe-261a-4d92-9f6d-092c5ebea6e7","Type":"ContainerDied","Data":"0b791d899605216f97340877008ac961d76e17004984e905f8a5214a3a7866bd"} Apr 21 16:02:23.208673 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.208598 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b_6961ef97-4c61-4ba5-85ed-00ddf50f12e9/main/0.log" Apr 21 16:02:23.209494 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.209398 2573 generic.go:358] "Generic (PLEG): container finished" podID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerID="d02937932f4340f31d95a430e5137692b510cd6d381683c1873ce81aa6192190" exitCode=137 Apr 21 16:02:23.209494 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.209417 2573 generic.go:358] "Generic (PLEG): container finished" podID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" 
containerID="9485dd29a524f2e1702ccb1e112dadd802a9a6344a09244ada677f9d86e00c3c" exitCode=0 Apr 21 16:02:23.209494 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.209445 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" event={"ID":"6961ef97-4c61-4ba5-85ed-00ddf50f12e9","Type":"ContainerDied","Data":"d02937932f4340f31d95a430e5137692b510cd6d381683c1873ce81aa6192190"} Apr 21 16:02:23.209494 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.209468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" event={"ID":"6961ef97-4c61-4ba5-85ed-00ddf50f12e9","Type":"ContainerDied","Data":"9485dd29a524f2e1702ccb1e112dadd802a9a6344a09244ada677f9d86e00c3c"} Apr 21 16:02:23.274998 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.274968 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 16:02:23.294467 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.294445 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b_6961ef97-4c61-4ba5-85ed-00ddf50f12e9/main/0.log" Apr 21 16:02:23.295253 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.295236 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 16:02:23.351531 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.351502 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-dshm\") pod \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " Apr 21 16:02:23.351664 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.351542 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kserve-provision-location\") pod \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " Apr 21 16:02:23.351664 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.351581 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k2pb\" (UniqueName: \"kubernetes.io/projected/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kube-api-access-9k2pb\") pod \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " Apr 21 16:02:23.351664 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.351655 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-model-cache\") pod \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " Apr 21 16:02:23.351865 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.351680 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-home\") pod \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " Apr 21 16:02:23.351865 
ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.351736 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-tls-certs\") pod \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\" (UID: \"88d11bfe-261a-4d92-9f6d-092c5ebea6e7\") " Apr 21 16:02:23.351991 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.351962 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-model-cache" (OuterVolumeSpecName: "model-cache") pod "88d11bfe-261a-4d92-9f6d-092c5ebea6e7" (UID: "88d11bfe-261a-4d92-9f6d-092c5ebea6e7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:02:23.352154 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.352124 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-home" (OuterVolumeSpecName: "home") pod "88d11bfe-261a-4d92-9f6d-092c5ebea6e7" (UID: "88d11bfe-261a-4d92-9f6d-092c5ebea6e7"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:02:23.352518 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.352498 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.352611 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.352523 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.353993 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.353962 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "88d11bfe-261a-4d92-9f6d-092c5ebea6e7" (UID: "88d11bfe-261a-4d92-9f6d-092c5ebea6e7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:02:23.354175 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.354150 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-dshm" (OuterVolumeSpecName: "dshm") pod "88d11bfe-261a-4d92-9f6d-092c5ebea6e7" (UID: "88d11bfe-261a-4d92-9f6d-092c5ebea6e7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:02:23.354257 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.354235 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kube-api-access-9k2pb" (OuterVolumeSpecName: "kube-api-access-9k2pb") pod "88d11bfe-261a-4d92-9f6d-092c5ebea6e7" (UID: "88d11bfe-261a-4d92-9f6d-092c5ebea6e7"). InnerVolumeSpecName "kube-api-access-9k2pb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:02:23.406832 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.406777 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88d11bfe-261a-4d92-9f6d-092c5ebea6e7" (UID: "88d11bfe-261a-4d92-9f6d-092c5ebea6e7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:02:23.453188 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453146 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkhrh\" (UniqueName: \"kubernetes.io/projected/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kube-api-access-vkhrh\") pod \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " Apr 21 16:02:23.453321 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453223 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-tls-certs\") pod \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " Apr 21 16:02:23.453321 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453266 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kserve-provision-location\") pod \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " Apr 21 16:02:23.453443 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453376 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-dshm\") pod \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\" 
(UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " Apr 21 16:02:23.453443 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453407 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-home\") pod \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " Apr 21 16:02:23.453443 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453434 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-model-cache\") pod \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\" (UID: \"6961ef97-4c61-4ba5-85ed-00ddf50f12e9\") " Apr 21 16:02:23.453724 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453699 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.453820 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453719 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-model-cache" (OuterVolumeSpecName: "model-cache") pod "6961ef97-4c61-4ba5-85ed-00ddf50f12e9" (UID: "6961ef97-4c61-4ba5-85ed-00ddf50f12e9"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:02:23.453820 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453732 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.453820 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453788 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9k2pb\" (UniqueName: \"kubernetes.io/projected/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-kube-api-access-9k2pb\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.454022 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453824 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88d11bfe-261a-4d92-9f6d-092c5ebea6e7-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.454022 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.453969 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-home" (OuterVolumeSpecName: "home") pod "6961ef97-4c61-4ba5-85ed-00ddf50f12e9" (UID: "6961ef97-4c61-4ba5-85ed-00ddf50f12e9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:02:23.455764 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.455733 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-dshm" (OuterVolumeSpecName: "dshm") pod "6961ef97-4c61-4ba5-85ed-00ddf50f12e9" (UID: "6961ef97-4c61-4ba5-85ed-00ddf50f12e9"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:02:23.455764 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.455750 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kube-api-access-vkhrh" (OuterVolumeSpecName: "kube-api-access-vkhrh") pod "6961ef97-4c61-4ba5-85ed-00ddf50f12e9" (UID: "6961ef97-4c61-4ba5-85ed-00ddf50f12e9"). InnerVolumeSpecName "kube-api-access-vkhrh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:02:23.456014 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.455828 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6961ef97-4c61-4ba5-85ed-00ddf50f12e9" (UID: "6961ef97-4c61-4ba5-85ed-00ddf50f12e9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:02:23.489900 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.489866 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:23.490076 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.489908 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:23.493299 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.493270 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:23.508882 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.508852 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "6961ef97-4c61-4ba5-85ed-00ddf50f12e9" (UID: "6961ef97-4c61-4ba5-85ed-00ddf50f12e9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:02:23.554650 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.554617 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vkhrh\" (UniqueName: \"kubernetes.io/projected/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kube-api-access-vkhrh\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.554650 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.554641 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.554650 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.554651 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.554931 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.554660 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.554931 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.554670 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.554931 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.554677 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/6961ef97-4c61-4ba5-85ed-00ddf50f12e9-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:02:23.726970 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:23.726882 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:02:24.215142 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.215110 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" Apr 21 16:02:24.215608 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.215109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4" event={"ID":"88d11bfe-261a-4d92-9f6d-092c5ebea6e7","Type":"ContainerDied","Data":"5363b6c18cab8f3ab2a215790dbb37606937ac0871df7139765565c3ef16442f"} Apr 21 16:02:24.215608 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.215250 2573 scope.go:117] "RemoveContainer" containerID="0b791d899605216f97340877008ac961d76e17004984e905f8a5214a3a7866bd" Apr 21 16:02:24.216974 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.216937 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b_6961ef97-4c61-4ba5-85ed-00ddf50f12e9/main/0.log" Apr 21 16:02:24.217981 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.217956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" 
event={"ID":"6961ef97-4c61-4ba5-85ed-00ddf50f12e9","Type":"ContainerDied","Data":"42d327fc25524ad4a1a2e59efbf3cf45d1fd39aa81291af32e651f9931d312c4"} Apr 21 16:02:24.218456 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.218431 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b" Apr 21 16:02:24.219481 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.219461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:24.238520 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.238490 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4"] Apr 21 16:02:24.239329 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.239308 2573 scope.go:117] "RemoveContainer" containerID="ceef73b45dc0a27f28ce71c4e7d72d7de0520aebd2f8da43a7e303799ac03e16" Apr 21 16:02:24.243533 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.243507 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-f47l6w4"] Apr 21 16:02:24.279407 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.279337 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b"] Apr 21 16:02:24.285514 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.285480 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7bfb78dbd-2t84b"] Apr 21 16:02:24.311725 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.311701 2573 scope.go:117] "RemoveContainer" containerID="d02937932f4340f31d95a430e5137692b510cd6d381683c1873ce81aa6192190" Apr 21 16:02:24.332498 ip-10-0-136-123 
kubenswrapper[2573]: I0421 16:02:24.332477 2573 scope.go:117] "RemoveContainer" containerID="8497fff3e514e70b68c56a45eefcb47553bcf37a64db2e6898530da1afab8c2c" Apr 21 16:02:24.387713 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:24.387684 2573 scope.go:117] "RemoveContainer" containerID="9485dd29a524f2e1702ccb1e112dadd802a9a6344a09244ada677f9d86e00c3c" Apr 21 16:02:25.482376 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:25.482344 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" path="/var/lib/kubelet/pods/6961ef97-4c61-4ba5-85ed-00ddf50f12e9/volumes" Apr 21 16:02:25.482863 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:25.482791 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" path="/var/lib/kubelet/pods/88d11bfe-261a-4d92-9f6d-092c5ebea6e7/volumes" Apr 21 16:02:33.726823 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:33.726760 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:02:43.727006 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:43.726944 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:02:45.226468 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:45.226427 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:02:53.727088 
ip-10-0-136-123 kubenswrapper[2573]: I0421 16:02:53.727044 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:03:03.727541 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:03:03.727491 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:03:13.727568 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:03:13.727526 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:03:23.727656 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:03:23.727603 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:03:33.727240 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:03:33.727141 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:03:43.727481 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:03:43.727440 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:03:53.727391 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:03:53.727348 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:04:03.727010 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:03.726966 2573 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" probeResult="failure" output="Get \"https://10.132.0.46:8000/health\": dial tcp 10.132.0.46:8000: connect: connection refused" Apr 21 16:04:13.736955 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:13.736920 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:04:13.744719 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:13.744690 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:04:35.126594 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:35.126550 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"] Apr 21 16:04:35.127007 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:35.126936 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main" containerID="cri-o://69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8" gracePeriod=30 Apr 21 16:04:46.624661 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.624629 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv"] Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.624987 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625000 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625014 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="storage-initializer" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625021 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="storage-initializer" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625038 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="llm-d-routing-sidecar" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625047 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="llm-d-routing-sidecar" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625053 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625058 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625066 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="storage-initializer" Apr 21 16:04:46.625104 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625071 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="storage-initializer" Apr 21 16:04:46.625498 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625134 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="main" Apr 21 16:04:46.625498 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625149 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="88d11bfe-261a-4d92-9f6d-092c5ebea6e7" containerName="main" Apr 21 16:04:46.625498 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.625156 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6961ef97-4c61-4ba5-85ed-00ddf50f12e9" containerName="llm-d-routing-sidecar" Apr 21 16:04:46.628226 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.628203 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.631136 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.631112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 21 16:04:46.639677 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.639650 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv"] Apr 21 16:04:46.773485 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.773448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-dshm\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.773652 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.773509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.773652 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.773632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt7pp\" (UniqueName: \"kubernetes.io/projected/b76c1fa9-446b-4a0f-8008-07d02ca53527-kube-api-access-rt7pp\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.773732 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.773697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b76c1fa9-446b-4a0f-8008-07d02ca53527-tls-certs\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.773773 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.773738 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-model-cache\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.773848 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.773811 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-home\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.874532 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.874486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt7pp\" (UniqueName: \"kubernetes.io/projected/b76c1fa9-446b-4a0f-8008-07d02ca53527-kube-api-access-rt7pp\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.874742 
ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.874547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b76c1fa9-446b-4a0f-8008-07d02ca53527-tls-certs\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.874742 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.874584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-model-cache\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.874742 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.874621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-home\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.874742 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.874648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-dshm\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.874742 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.874680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.875238 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.875214 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.875655 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.875630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-home\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.875862 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.875841 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-model-cache\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.877428 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.877406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-dshm\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: 
\"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.878066 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.878046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b76c1fa9-446b-4a0f-8008-07d02ca53527-tls-certs\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.890483 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.890463 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt7pp\" (UniqueName: \"kubernetes.io/projected/b76c1fa9-446b-4a0f-8008-07d02ca53527-kube-api-access-rt7pp\") pod \"scheduler-inline-config-test-kserve-6d6487c47d-z54hv\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:46.941294 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:46.941266 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:47.070785 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:47.070755 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv"] Apr 21 16:04:47.072752 ip-10-0-136-123 kubenswrapper[2573]: W0421 16:04:47.072724 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb76c1fa9_446b_4a0f_8008_07d02ca53527.slice/crio-3c05b41b6e80cbf7a0462113f194bf13e4d0f33193b7500ce28afd38bc4410b2 WatchSource:0}: Error finding container 3c05b41b6e80cbf7a0462113f194bf13e4d0f33193b7500ce28afd38bc4410b2: Status 404 returned error can't find the container with id 3c05b41b6e80cbf7a0462113f194bf13e4d0f33193b7500ce28afd38bc4410b2 Apr 21 16:04:47.736960 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:47.736918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" event={"ID":"b76c1fa9-446b-4a0f-8008-07d02ca53527","Type":"ContainerStarted","Data":"613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252"} Apr 21 16:04:47.736960 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:47.736960 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" event={"ID":"b76c1fa9-446b-4a0f-8008-07d02ca53527","Type":"ContainerStarted","Data":"3c05b41b6e80cbf7a0462113f194bf13e4d0f33193b7500ce28afd38bc4410b2"} Apr 21 16:04:48.256783 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:48.256743 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn"] Apr 21 16:04:48.257201 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:48.257151 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="main" containerID="cri-o://32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d" gracePeriod=30 Apr 21 16:04:48.257529 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:48.257504 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="tokenizer" containerID="cri-o://9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3" gracePeriod=30 Apr 21 16:04:48.742614 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:48.742568 2573 generic.go:358] "Generic (PLEG): container finished" podID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerID="32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d" exitCode=0 Apr 21 16:04:48.743198 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:48.742643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" event={"ID":"f086989a-62a5-4fa3-a5ad-2ff37aec85bc","Type":"ContainerDied","Data":"32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d"} Apr 21 16:04:49.523925 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.523902 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:04:49.601106 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601077 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-tmp\") pod \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " Apr 21 16:04:49.601313 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601158 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-uds\") pod \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " Apr 21 16:04:49.601313 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601205 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-cache\") pod \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " Apr 21 16:04:49.601313 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601241 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kserve-provision-location\") pod \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " Apr 21 16:04:49.601313 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601291 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tls-certs\") pod \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " Apr 21 
16:04:49.601561 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601331 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzl6h\" (UniqueName: \"kubernetes.io/projected/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kube-api-access-lzl6h\") pod \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\" (UID: \"f086989a-62a5-4fa3-a5ad-2ff37aec85bc\") " Apr 21 16:04:49.601561 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601442 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f086989a-62a5-4fa3-a5ad-2ff37aec85bc" (UID: "f086989a-62a5-4fa3-a5ad-2ff37aec85bc"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:49.601737 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601549 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f086989a-62a5-4fa3-a5ad-2ff37aec85bc" (UID: "f086989a-62a5-4fa3-a5ad-2ff37aec85bc"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:49.601737 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601563 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f086989a-62a5-4fa3-a5ad-2ff37aec85bc" (UID: "f086989a-62a5-4fa3-a5ad-2ff37aec85bc"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:49.601737 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601680 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-tmp\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.601737 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601703 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-uds\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.601737 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601720 2573 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tokenizer-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.602101 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.601990 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f086989a-62a5-4fa3-a5ad-2ff37aec85bc" (UID: "f086989a-62a5-4fa3-a5ad-2ff37aec85bc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:49.603570 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.603546 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f086989a-62a5-4fa3-a5ad-2ff37aec85bc" (UID: "f086989a-62a5-4fa3-a5ad-2ff37aec85bc"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:04:49.604591 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.604569 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kube-api-access-lzl6h" (OuterVolumeSpecName: "kube-api-access-lzl6h") pod "f086989a-62a5-4fa3-a5ad-2ff37aec85bc" (UID: "f086989a-62a5-4fa3-a5ad-2ff37aec85bc"). InnerVolumeSpecName "kube-api-access-lzl6h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:04:49.703010 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.702922 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.703010 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.702964 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.703010 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.702975 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzl6h\" (UniqueName: \"kubernetes.io/projected/f086989a-62a5-4fa3-a5ad-2ff37aec85bc-kube-api-access-lzl6h\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:49.748857 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.748823 2573 generic.go:358] "Generic (PLEG): container finished" podID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerID="9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3" exitCode=0 Apr 21 16:04:49.749246 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.748900 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" 
event={"ID":"f086989a-62a5-4fa3-a5ad-2ff37aec85bc","Type":"ContainerDied","Data":"9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3"} Apr 21 16:04:49.749246 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.748921 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" event={"ID":"f086989a-62a5-4fa3-a5ad-2ff37aec85bc","Type":"ContainerDied","Data":"da26bc1adaf54629f58fb051f765e468398fca6e0153e9e1af60e9e05aafe259"} Apr 21 16:04:49.749246 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.748937 2573 scope.go:117] "RemoveContainer" containerID="9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3" Apr 21 16:04:49.749246 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.749000 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn" Apr 21 16:04:49.761656 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.761636 2573 scope.go:117] "RemoveContainer" containerID="32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d" Apr 21 16:04:49.771085 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.771060 2573 scope.go:117] "RemoveContainer" containerID="12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b" Apr 21 16:04:49.777005 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.776965 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn"] Apr 21 16:04:49.780105 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.780086 2573 scope.go:117] "RemoveContainer" containerID="9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3" Apr 21 16:04:49.780426 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:04:49.780402 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3\": container with ID starting with 9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3 not found: ID does not exist" containerID="9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3" Apr 21 16:04:49.780523 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.780440 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3"} err="failed to get container status \"9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3\": rpc error: code = NotFound desc = could not find container \"9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3\": container with ID starting with 9f37ce8c58af95e9b1c774b5f3308952e2144420e7654c2a1c9093d9ab8425c3 not found: ID does not exist" Apr 21 16:04:49.780523 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.780464 2573 scope.go:117] "RemoveContainer" containerID="32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d" Apr 21 16:04:49.780750 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:04:49.780732 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d\": container with ID starting with 32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d not found: ID does not exist" containerID="32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d" Apr 21 16:04:49.780845 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.780759 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d"} err="failed to get container status \"32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d\": rpc error: code = NotFound desc = could not find container 
\"32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d\": container with ID starting with 32ca676bc24430d0e4cf82f4a89911ae86e8a1ab5b2d4e3aa436939832283e4d not found: ID does not exist" Apr 21 16:04:49.780845 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.780781 2573 scope.go:117] "RemoveContainer" containerID="12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b" Apr 21 16:04:49.781074 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:04:49.781059 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b\": container with ID starting with 12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b not found: ID does not exist" containerID="12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b" Apr 21 16:04:49.781134 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.781080 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b"} err="failed to get container status \"12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b\": rpc error: code = NotFound desc = could not find container \"12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b\": container with ID starting with 12e383e661f12f659fffe8b4bd2a99387d84a4d835cd6d58ec69d6d10c3f648b not found: ID does not exist" Apr 21 16:04:49.781546 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.781524 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schetj9zn"] Apr 21 16:04:49.985681 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:49.985596 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 21 16:04:49.985899 ip-10-0-136-123 kubenswrapper[2573]: 
I0421 16:04:49.985877 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="34799a69-4511-4755-9aee-d09129de977c" containerName="main" containerID="cri-o://2cf8e9c5f71acf6df006f272efdfdb1fc9828c2ecd40ede94eb110757bc4e58a" gracePeriod=30 Apr 21 16:04:50.755220 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:50.755183 2573 generic.go:358] "Generic (PLEG): container finished" podID="34799a69-4511-4755-9aee-d09129de977c" containerID="2cf8e9c5f71acf6df006f272efdfdb1fc9828c2ecd40ede94eb110757bc4e58a" exitCode=0 Apr 21 16:04:50.755577 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:50.755218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"34799a69-4511-4755-9aee-d09129de977c","Type":"ContainerDied","Data":"2cf8e9c5f71acf6df006f272efdfdb1fc9828c2ecd40ede94eb110757bc4e58a"} Apr 21 16:04:50.866637 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:50.866614 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:04:51.015000 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.014961 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57vkv\" (UniqueName: \"kubernetes.io/projected/34799a69-4511-4755-9aee-d09129de977c-kube-api-access-57vkv\") pod \"34799a69-4511-4755-9aee-d09129de977c\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " Apr 21 16:04:51.015181 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.015051 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34799a69-4511-4755-9aee-d09129de977c-tls-certs\") pod \"34799a69-4511-4755-9aee-d09129de977c\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " Apr 21 16:04:51.015181 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.015089 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-home\") pod \"34799a69-4511-4755-9aee-d09129de977c\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " Apr 21 16:04:51.015181 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.015121 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-model-cache\") pod \"34799a69-4511-4755-9aee-d09129de977c\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " Apr 21 16:04:51.015181 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.015169 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-dshm\") pod \"34799a69-4511-4755-9aee-d09129de977c\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " Apr 21 16:04:51.015491 ip-10-0-136-123 kubenswrapper[2573]: I0421 
16:04:51.015218 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-kserve-provision-location\") pod \"34799a69-4511-4755-9aee-d09129de977c\" (UID: \"34799a69-4511-4755-9aee-d09129de977c\") " Apr 21 16:04:51.015604 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.015553 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-home" (OuterVolumeSpecName: "home") pod "34799a69-4511-4755-9aee-d09129de977c" (UID: "34799a69-4511-4755-9aee-d09129de977c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:51.015726 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.015630 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-model-cache" (OuterVolumeSpecName: "model-cache") pod "34799a69-4511-4755-9aee-d09129de977c" (UID: "34799a69-4511-4755-9aee-d09129de977c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:51.017813 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.017773 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34799a69-4511-4755-9aee-d09129de977c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "34799a69-4511-4755-9aee-d09129de977c" (UID: "34799a69-4511-4755-9aee-d09129de977c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:04:51.017913 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.017871 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34799a69-4511-4755-9aee-d09129de977c-kube-api-access-57vkv" (OuterVolumeSpecName: "kube-api-access-57vkv") pod "34799a69-4511-4755-9aee-d09129de977c" (UID: "34799a69-4511-4755-9aee-d09129de977c"). InnerVolumeSpecName "kube-api-access-57vkv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:04:51.018015 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.017991 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-dshm" (OuterVolumeSpecName: "dshm") pod "34799a69-4511-4755-9aee-d09129de977c" (UID: "34799a69-4511-4755-9aee-d09129de977c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:51.071839 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.071764 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34799a69-4511-4755-9aee-d09129de977c" (UID: "34799a69-4511-4755-9aee-d09129de977c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:04:51.116573 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.116462 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:51.116573 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.116489 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:51.116573 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.116500 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57vkv\" (UniqueName: \"kubernetes.io/projected/34799a69-4511-4755-9aee-d09129de977c-kube-api-access-57vkv\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:51.116573 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.116509 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34799a69-4511-4755-9aee-d09129de977c-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:51.116573 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.116519 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:51.116573 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.116527 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/34799a69-4511-4755-9aee-d09129de977c-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:04:51.481269 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.481194 2573 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" path="/var/lib/kubelet/pods/f086989a-62a5-4fa3-a5ad-2ff37aec85bc/volumes" Apr 21 16:04:51.761827 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.761720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"34799a69-4511-4755-9aee-d09129de977c","Type":"ContainerDied","Data":"465d9fcff044c8d1a433b64725ebc60eadb3c7208e1f15001f5cc1cbe059ea5e"} Apr 21 16:04:51.761827 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.761774 2573 scope.go:117] "RemoveContainer" containerID="2cf8e9c5f71acf6df006f272efdfdb1fc9828c2ecd40ede94eb110757bc4e58a" Apr 21 16:04:51.761827 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.761785 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 21 16:04:51.763288 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.763261 2573 generic.go:358] "Generic (PLEG): container finished" podID="b76c1fa9-446b-4a0f-8008-07d02ca53527" containerID="613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252" exitCode=0 Apr 21 16:04:51.763511 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.763344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" event={"ID":"b76c1fa9-446b-4a0f-8008-07d02ca53527","Type":"ContainerDied","Data":"613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252"} Apr 21 16:04:51.779725 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.779706 2573 scope.go:117] "RemoveContainer" containerID="2ed10b75a645c7c254b2b5f03595163ed6aec007703deb9189dd99ce1e342de4" Apr 21 16:04:51.782973 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.782950 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 21 16:04:51.786941 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:51.786919 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 21 16:04:52.768395 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:52.768361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" event={"ID":"b76c1fa9-446b-4a0f-8008-07d02ca53527","Type":"ContainerStarted","Data":"2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea"} Apr 21 16:04:52.790377 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:52.790330 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" podStartSLOduration=6.790315512 podStartE2EDuration="6.790315512s" podCreationTimestamp="2026-04-21 16:04:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:04:52.789201307 +0000 UTC m=+1787.919254982" watchObservedRunningTime="2026-04-21 16:04:52.790315512 +0000 UTC m=+1787.920369140" Apr 21 16:04:53.482093 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:53.482062 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34799a69-4511-4755-9aee-d09129de977c" path="/var/lib/kubelet/pods/34799a69-4511-4755-9aee-d09129de977c/volumes" Apr 21 16:04:56.941827 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:56.941715 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:56.941827 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:56.941784 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:56.954247 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:56.954224 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:04:57.797757 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:04:57.797720 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:05:05.439323 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.439297 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:05:05.519647 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.519619 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 16:05:05.521646 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.521615 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpwmz\" (UniqueName: \"kubernetes.io/projected/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kube-api-access-bpwmz\") pod \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " Apr 21 16:05:05.521769 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.521688 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e042e45b-8908-48d8-8c84-1cdda8ab66bb-tls-certs\") pod \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " Apr 21 16:05:05.521769 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.521738 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kserve-provision-location\") pod \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " Apr 21 16:05:05.521769 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.521763 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-model-cache\") pod \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " Apr 21 16:05:05.521995 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.521827 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-home\") pod \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " Apr 21 16:05:05.521995 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.521841 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-dshm\") pod \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\" (UID: \"e042e45b-8908-48d8-8c84-1cdda8ab66bb\") " Apr 21 16:05:05.522099 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.522055 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-model-cache" (OuterVolumeSpecName: "model-cache") pod "e042e45b-8908-48d8-8c84-1cdda8ab66bb" (UID: "e042e45b-8908-48d8-8c84-1cdda8ab66bb"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:05.522226 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.522212 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.522334 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.522301 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-home" (OuterVolumeSpecName: "home") pod "e042e45b-8908-48d8-8c84-1cdda8ab66bb" (UID: "e042e45b-8908-48d8-8c84-1cdda8ab66bb"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:05.523418 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.523398 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 16:05:05.523999 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.523972 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e042e45b-8908-48d8-8c84-1cdda8ab66bb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e042e45b-8908-48d8-8c84-1cdda8ab66bb" (UID: "e042e45b-8908-48d8-8c84-1cdda8ab66bb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:05.523999 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.523986 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kube-api-access-bpwmz" (OuterVolumeSpecName: "kube-api-access-bpwmz") pod "e042e45b-8908-48d8-8c84-1cdda8ab66bb" (UID: "e042e45b-8908-48d8-8c84-1cdda8ab66bb"). InnerVolumeSpecName "kube-api-access-bpwmz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:05.524210 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.524190 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-dshm" (OuterVolumeSpecName: "dshm") pod "e042e45b-8908-48d8-8c84-1cdda8ab66bb" (UID: "e042e45b-8908-48d8-8c84-1cdda8ab66bb"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:05.592103 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.592076 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e042e45b-8908-48d8-8c84-1cdda8ab66bb" (UID: "e042e45b-8908-48d8-8c84-1cdda8ab66bb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:05.623491 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.623458 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e042e45b-8908-48d8-8c84-1cdda8ab66bb-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.623491 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.623491 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.623611 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.623502 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.623611 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.623511 2573 
reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e042e45b-8908-48d8-8c84-1cdda8ab66bb-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.623611 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.623519 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpwmz\" (UniqueName: \"kubernetes.io/projected/e042e45b-8908-48d8-8c84-1cdda8ab66bb-kube-api-access-bpwmz\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:05.812270 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.812243 2573 generic.go:358] "Generic (PLEG): container finished" podID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerID="69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8" exitCode=137 Apr 21 16:05:05.812368 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.812299 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" event={"ID":"e042e45b-8908-48d8-8c84-1cdda8ab66bb","Type":"ContainerDied","Data":"69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8"} Apr 21 16:05:05.812368 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.812328 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" event={"ID":"e042e45b-8908-48d8-8c84-1cdda8ab66bb","Type":"ContainerDied","Data":"6092fd9e2686131c5007feb5cc191cdc143c44fcefd7e2e68d10cee2d18ec17c"} Apr 21 16:05:05.812368 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.812347 2573 scope.go:117] "RemoveContainer" containerID="69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8" Apr 21 16:05:05.812368 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.812349 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw" Apr 21 16:05:05.834726 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.834704 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"] Apr 21 16:05:05.842350 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.842329 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-8f8ffb558-6d6bw"] Apr 21 16:05:05.842871 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.842856 2573 scope.go:117] "RemoveContainer" containerID="871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781" Apr 21 16:05:05.851755 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.851740 2573 scope.go:117] "RemoveContainer" containerID="69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8" Apr 21 16:05:05.852018 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:05:05.852001 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8\": container with ID starting with 69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8 not found: ID does not exist" containerID="69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8" Apr 21 16:05:05.852089 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.852033 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8"} err="failed to get container status \"69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8\": rpc error: code = NotFound desc = could not find container \"69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8\": container with ID starting with 
69923fe1a10e07835a7c62008592b2de54d0d935899e4441f516d7fbd23aced8 not found: ID does not exist" Apr 21 16:05:05.852089 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.852052 2573 scope.go:117] "RemoveContainer" containerID="871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781" Apr 21 16:05:05.852293 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:05:05.852274 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781\": container with ID starting with 871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781 not found: ID does not exist" containerID="871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781" Apr 21 16:05:05.852345 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:05.852303 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781"} err="failed to get container status \"871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781\": rpc error: code = NotFound desc = could not find container \"871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781\": container with ID starting with 871ecce5a23bac17866bb1d1f0978ede28a4e47457f64c0b2d8368b6292f5781 not found: ID does not exist" Apr 21 16:05:07.481049 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:07.481015 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" path="/var/lib/kubelet/pods/e042e45b-8908-48d8-8c84-1cdda8ab66bb/volumes" Apr 21 16:05:30.351243 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.351209 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv"] Apr 21 16:05:30.351716 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.351497 2573 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" podUID="b76c1fa9-446b-4a0f-8008-07d02ca53527" containerName="main" containerID="cri-o://2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea" gracePeriod=30 Apr 21 16:05:30.590367 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.590344 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:05:30.606265 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.606219 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b76c1fa9-446b-4a0f-8008-07d02ca53527-tls-certs\") pod \"b76c1fa9-446b-4a0f-8008-07d02ca53527\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " Apr 21 16:05:30.606333 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.606277 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-kserve-provision-location\") pod \"b76c1fa9-446b-4a0f-8008-07d02ca53527\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " Apr 21 16:05:30.606333 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.606316 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-model-cache\") pod \"b76c1fa9-446b-4a0f-8008-07d02ca53527\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " Apr 21 16:05:30.606411 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.606346 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-dshm\") pod \"b76c1fa9-446b-4a0f-8008-07d02ca53527\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " Apr 
21 16:05:30.606411 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.606362 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt7pp\" (UniqueName: \"kubernetes.io/projected/b76c1fa9-446b-4a0f-8008-07d02ca53527-kube-api-access-rt7pp\") pod \"b76c1fa9-446b-4a0f-8008-07d02ca53527\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " Apr 21 16:05:30.606411 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.606377 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-home\") pod \"b76c1fa9-446b-4a0f-8008-07d02ca53527\" (UID: \"b76c1fa9-446b-4a0f-8008-07d02ca53527\") " Apr 21 16:05:30.606720 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.606668 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-model-cache" (OuterVolumeSpecName: "model-cache") pod "b76c1fa9-446b-4a0f-8008-07d02ca53527" (UID: "b76c1fa9-446b-4a0f-8008-07d02ca53527"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:30.606834 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.606749 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-home" (OuterVolumeSpecName: "home") pod "b76c1fa9-446b-4a0f-8008-07d02ca53527" (UID: "b76c1fa9-446b-4a0f-8008-07d02ca53527"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:30.608414 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.608392 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76c1fa9-446b-4a0f-8008-07d02ca53527-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b76c1fa9-446b-4a0f-8008-07d02ca53527" (UID: "b76c1fa9-446b-4a0f-8008-07d02ca53527"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:30.608652 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.608638 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-dshm" (OuterVolumeSpecName: "dshm") pod "b76c1fa9-446b-4a0f-8008-07d02ca53527" (UID: "b76c1fa9-446b-4a0f-8008-07d02ca53527"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:30.608735 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.608719 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76c1fa9-446b-4a0f-8008-07d02ca53527-kube-api-access-rt7pp" (OuterVolumeSpecName: "kube-api-access-rt7pp") pod "b76c1fa9-446b-4a0f-8008-07d02ca53527" (UID: "b76c1fa9-446b-4a0f-8008-07d02ca53527"). InnerVolumeSpecName "kube-api-access-rt7pp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:30.685029 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.684994 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b76c1fa9-446b-4a0f-8008-07d02ca53527" (UID: "b76c1fa9-446b-4a0f-8008-07d02ca53527"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:30.707003 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.706983 2573 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b76c1fa9-446b-4a0f-8008-07d02ca53527-tls-certs\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:30.707098 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.707007 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-kserve-provision-location\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:30.707098 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.707017 2573 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-model-cache\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:30.707098 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.707025 2573 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-dshm\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:30.707098 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.707035 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rt7pp\" (UniqueName: \"kubernetes.io/projected/b76c1fa9-446b-4a0f-8008-07d02ca53527-kube-api-access-rt7pp\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:30.707098 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.707043 2573 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b76c1fa9-446b-4a0f-8008-07d02ca53527-home\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\"" Apr 21 16:05:30.900814 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.900730 2573 
generic.go:358] "Generic (PLEG): container finished" podID="b76c1fa9-446b-4a0f-8008-07d02ca53527" containerID="2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea" exitCode=0 Apr 21 16:05:30.900927 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.900816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" event={"ID":"b76c1fa9-446b-4a0f-8008-07d02ca53527","Type":"ContainerDied","Data":"2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea"} Apr 21 16:05:30.900927 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.900856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" event={"ID":"b76c1fa9-446b-4a0f-8008-07d02ca53527","Type":"ContainerDied","Data":"3c05b41b6e80cbf7a0462113f194bf13e4d0f33193b7500ce28afd38bc4410b2"} Apr 21 16:05:30.900927 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.900872 2573 scope.go:117] "RemoveContainer" containerID="2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea" Apr 21 16:05:30.900927 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.900824 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv" Apr 21 16:05:30.910283 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.910059 2573 scope.go:117] "RemoveContainer" containerID="613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252" Apr 21 16:05:30.920532 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.920515 2573 scope.go:117] "RemoveContainer" containerID="2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea" Apr 21 16:05:30.920776 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:05:30.920760 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea\": container with ID starting with 2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea not found: ID does not exist" containerID="2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea" Apr 21 16:05:30.920835 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.920783 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea"} err="failed to get container status \"2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea\": rpc error: code = NotFound desc = could not find container \"2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea\": container with ID starting with 2d1f117acda57c9589cac34e1ecfad6eaee4adcba770a8f9129b815a305f47ea not found: ID does not exist" Apr 21 16:05:30.920835 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.920820 2573 scope.go:117] "RemoveContainer" containerID="613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252" Apr 21 16:05:30.921046 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:05:30.921028 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252\": container with ID starting with 613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252 not found: ID does not exist" containerID="613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252" Apr 21 16:05:30.921085 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.921052 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252"} err="failed to get container status \"613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252\": rpc error: code = NotFound desc = could not find container \"613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252\": container with ID starting with 613cc7aa7a74a01db364f6d91dc9b8536aeed499c7f3ef72c9ff5169bdd53252 not found: ID does not exist" Apr 21 16:05:30.924624 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.924596 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv"] Apr 21 16:05:30.926992 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:30.926971 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6d6487c47d-z54hv"] Apr 21 16:05:31.395832 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.395788 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p4kt2/must-gather-zslsl"] Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396106 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="storage-initializer" Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396117 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="storage-initializer" Apr 21 16:05:31.396174 
ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396131 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b76c1fa9-446b-4a0f-8008-07d02ca53527" containerName="main"
Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396136 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76c1fa9-446b-4a0f-8008-07d02ca53527" containerName="main"
Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396148 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="main"
Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396153 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="main"
Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396159 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b76c1fa9-446b-4a0f-8008-07d02ca53527" containerName="storage-initializer"
Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396164 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76c1fa9-446b-4a0f-8008-07d02ca53527" containerName="storage-initializer"
Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396173 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34799a69-4511-4755-9aee-d09129de977c" containerName="storage-initializer"
Apr 21 16:05:31.396174 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396178 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="34799a69-4511-4755-9aee-d09129de977c" containerName="storage-initializer"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396185 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="tokenizer"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396191 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="tokenizer"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396201 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="storage-initializer"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396206 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="storage-initializer"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396212 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396217 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396224 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34799a69-4511-4755-9aee-d09129de977c" containerName="main"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396229 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="34799a69-4511-4755-9aee-d09129de977c" containerName="main"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396270 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b76c1fa9-446b-4a0f-8008-07d02ca53527" containerName="main"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396279 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="34799a69-4511-4755-9aee-d09129de977c" containerName="main"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396285 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="tokenizer"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396292 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e042e45b-8908-48d8-8c84-1cdda8ab66bb" containerName="main"
Apr 21 16:05:31.396526 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.396298 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f086989a-62a5-4fa3-a5ad-2ff37aec85bc" containerName="main"
Apr 21 16:05:31.401418 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.401400 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:05:31.404612 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.404580 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p4kt2\"/\"default-dockercfg-lndnc\""
Apr 21 16:05:31.404755 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.404579 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p4kt2\"/\"openshift-service-ca.crt\""
Apr 21 16:05:31.404755 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.404624 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p4kt2\"/\"kube-root-ca.crt\""
Apr 21 16:05:31.405280 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.405259 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4kt2/must-gather-zslsl"]
Apr 21 16:05:31.481354 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.481325 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76c1fa9-446b-4a0f-8008-07d02ca53527" path="/var/lib/kubelet/pods/b76c1fa9-446b-4a0f-8008-07d02ca53527/volumes"
Apr 21 16:05:31.513839 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.513816 2573
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23eff0b7-5be9-4040-9d23-628312a83023-must-gather-output\") pod \"must-gather-zslsl\" (UID: \"23eff0b7-5be9-4040-9d23-628312a83023\") " pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:05:31.513938 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.513845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw9pk\" (UniqueName: \"kubernetes.io/projected/23eff0b7-5be9-4040-9d23-628312a83023-kube-api-access-sw9pk\") pod \"must-gather-zslsl\" (UID: \"23eff0b7-5be9-4040-9d23-628312a83023\") " pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:05:31.614786 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.614758 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23eff0b7-5be9-4040-9d23-628312a83023-must-gather-output\") pod \"must-gather-zslsl\" (UID: \"23eff0b7-5be9-4040-9d23-628312a83023\") " pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:05:31.614914 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.614816 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw9pk\" (UniqueName: \"kubernetes.io/projected/23eff0b7-5be9-4040-9d23-628312a83023-kube-api-access-sw9pk\") pod \"must-gather-zslsl\" (UID: \"23eff0b7-5be9-4040-9d23-628312a83023\") " pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:05:31.615079 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.615058 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23eff0b7-5be9-4040-9d23-628312a83023-must-gather-output\") pod \"must-gather-zslsl\" (UID: \"23eff0b7-5be9-4040-9d23-628312a83023\") " pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:05:31.623288 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.623262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw9pk\" (UniqueName: \"kubernetes.io/projected/23eff0b7-5be9-4040-9d23-628312a83023-kube-api-access-sw9pk\") pod \"must-gather-zslsl\" (UID: \"23eff0b7-5be9-4040-9d23-628312a83023\") " pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:05:31.711345 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.711294 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:05:31.827476 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.827436 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p4kt2/must-gather-zslsl"]
Apr 21 16:05:31.829779 ip-10-0-136-123 kubenswrapper[2573]: W0421 16:05:31.829750 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23eff0b7_5be9_4040_9d23_628312a83023.slice/crio-87008d3f2754b32aa1293a8fb7d1e4fd1db681db4355fc984815e2549d798d7e WatchSource:0}: Error finding container 87008d3f2754b32aa1293a8fb7d1e4fd1db681db4355fc984815e2549d798d7e: Status 404 returned error can't find the container with id 87008d3f2754b32aa1293a8fb7d1e4fd1db681db4355fc984815e2549d798d7e
Apr 21 16:05:31.905100 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:31.905068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4kt2/must-gather-zslsl" event={"ID":"23eff0b7-5be9-4040-9d23-628312a83023","Type":"ContainerStarted","Data":"87008d3f2754b32aa1293a8fb7d1e4fd1db681db4355fc984815e2549d798d7e"}
Apr 21 16:05:36.927051 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:36.927013 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4kt2/must-gather-zslsl" event={"ID":"23eff0b7-5be9-4040-9d23-628312a83023","Type":"ContainerStarted","Data":"4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd"}
Apr 21 16:05:36.927051 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:36.927051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4kt2/must-gather-zslsl" event={"ID":"23eff0b7-5be9-4040-9d23-628312a83023","Type":"ContainerStarted","Data":"9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd"}
Apr 21 16:05:36.943244 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:36.943196 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p4kt2/must-gather-zslsl" podStartSLOduration=1.470008521 podStartE2EDuration="5.943180927s" podCreationTimestamp="2026-04-21 16:05:31 +0000 UTC" firstStartedPulling="2026-04-21 16:05:31.831593621 +0000 UTC m=+1826.961647248" lastFinishedPulling="2026-04-21 16:05:36.304766039 +0000 UTC m=+1831.434819654" observedRunningTime="2026-04-21 16:05:36.942502122 +0000 UTC m=+1832.072555756" watchObservedRunningTime="2026-04-21 16:05:36.943180927 +0000 UTC m=+1832.073234560"
Apr 21 16:05:45.339733 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:45.339698 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:45.399558 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:45.399531 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:46.320924 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:46.320894 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:46.336785
ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:46.336758 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:47.248283 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:47.248250 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:47.266791 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:47.266760 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:48.140216 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:48.140188 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:48.155078 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:48.155044 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:49.021144 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:49.021114 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:49.036664 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:49.036629 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:49.914256 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:49.914226 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:49.930502 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:49.930472 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:50.813631 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:50.813603 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:50.829840 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:50.829814 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:51.693202 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:51.693173 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:51.708574 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:51.708548 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:52.580516 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:52.580481 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:52.596479 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:52.596451 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:53.493097 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:53.493071 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:53.511055 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:53.511031 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:54.460998 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:54.460969 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:54.480852 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:54.480831 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:55.416277 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:55.416244 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:55.433954 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:55.433925 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:56.425710 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:56.425683 2573 log.go:25] "Finished parsing log file"
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:56.442295 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:56.442268 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:57.374217 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:57.374189 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-pjq6c_4a4d299c-f46e-479b-aeee-2f438d299618/istio-proxy/0.log"
Apr 21 16:05:57.390744 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:57.390702 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-nm684_93c9ec2a-89a5-4cf1-a9f1-0e414968c8cf/istio-proxy/0.log"
Apr 21 16:05:58.392421 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:58.392394 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-8z8tz_79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996/istio-proxy/0.log"
Apr 21 16:05:59.276784 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:05:59.276755 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-8z8tz_79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996/istio-proxy/0.log"
Apr 21 16:06:00.121655 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:00.121610 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-jpxwt_bfa1b708-a291-48e4-a35f-8ddf287fc1b8/manager/0.log"
Apr 21 16:06:00.214486 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:00.214456 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-skws5_65e3be56-85d2-4fd7-8fc9-dd5648a64ac8/limitador/0.log"
Apr 21 16:06:00.228695 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:00.228668 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-8z254_31e563af-bf75-499d-93b3-29b3469180f2/manager/0.log"
Apr 21 16:06:01.016337 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:01.016307 2573 generic.go:358] "Generic (PLEG): container finished" podID="23eff0b7-5be9-4040-9d23-628312a83023" containerID="9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd" exitCode=0
Apr 21 16:06:01.016542 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:01.016360 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p4kt2/must-gather-zslsl" event={"ID":"23eff0b7-5be9-4040-9d23-628312a83023","Type":"ContainerDied","Data":"9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd"}
Apr 21 16:06:01.016709 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:01.016692 2573 scope.go:117] "RemoveContainer" containerID="9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd"
Apr 21 16:06:01.889701 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:01.889669 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p4kt2_must-gather-zslsl_23eff0b7-5be9-4040-9d23-628312a83023/gather/0.log"
Apr 21 16:06:02.558845 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.558810 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dt7xd/must-gather-pzt4m"]
Apr 21 16:06:02.565033 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.565017 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dt7xd/must-gather-pzt4m"
Apr 21 16:06:02.567952 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.567926 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dt7xd\"/\"default-dockercfg-k8dzq\""
Apr 21 16:06:02.569226 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.569203 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dt7xd\"/\"kube-root-ca.crt\""
Apr 21 16:06:02.569226 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.569222 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dt7xd\"/\"openshift-service-ca.crt\""
Apr 21 16:06:02.571496 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.571468 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dt7xd/must-gather-pzt4m"]
Apr 21 16:06:02.692215 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.692190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3fd20f0-07a3-4202-9ded-94c48a06d179-must-gather-output\") pod \"must-gather-pzt4m\" (UID: \"f3fd20f0-07a3-4202-9ded-94c48a06d179\") " pod="openshift-must-gather-dt7xd/must-gather-pzt4m"
Apr 21 16:06:02.692362 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.692238 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhpq9\" (UniqueName: \"kubernetes.io/projected/f3fd20f0-07a3-4202-9ded-94c48a06d179-kube-api-access-zhpq9\") pod \"must-gather-pzt4m\" (UID: \"f3fd20f0-07a3-4202-9ded-94c48a06d179\") " pod="openshift-must-gather-dt7xd/must-gather-pzt4m"
Apr 21 16:06:02.793330 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.793301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhpq9\" (UniqueName:
\"kubernetes.io/projected/f3fd20f0-07a3-4202-9ded-94c48a06d179-kube-api-access-zhpq9\") pod \"must-gather-pzt4m\" (UID: \"f3fd20f0-07a3-4202-9ded-94c48a06d179\") " pod="openshift-must-gather-dt7xd/must-gather-pzt4m"
Apr 21 16:06:02.793466 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.793353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3fd20f0-07a3-4202-9ded-94c48a06d179-must-gather-output\") pod \"must-gather-pzt4m\" (UID: \"f3fd20f0-07a3-4202-9ded-94c48a06d179\") " pod="openshift-must-gather-dt7xd/must-gather-pzt4m"
Apr 21 16:06:02.793662 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.793643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f3fd20f0-07a3-4202-9ded-94c48a06d179-must-gather-output\") pod \"must-gather-pzt4m\" (UID: \"f3fd20f0-07a3-4202-9ded-94c48a06d179\") " pod="openshift-must-gather-dt7xd/must-gather-pzt4m"
Apr 21 16:06:02.802442 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.802411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhpq9\" (UniqueName: \"kubernetes.io/projected/f3fd20f0-07a3-4202-9ded-94c48a06d179-kube-api-access-zhpq9\") pod \"must-gather-pzt4m\" (UID: \"f3fd20f0-07a3-4202-9ded-94c48a06d179\") " pod="openshift-must-gather-dt7xd/must-gather-pzt4m"
Apr 21 16:06:02.874305 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:02.874236 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dt7xd/must-gather-pzt4m"
Apr 21 16:06:03.201184 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:03.201159 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dt7xd/must-gather-pzt4m"]
Apr 21 16:06:03.203234 ip-10-0-136-123 kubenswrapper[2573]: W0421 16:06:03.203205 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3fd20f0_07a3_4202_9ded_94c48a06d179.slice/crio-507afcf519b9b677af4e210900e46d224e8a376c8acdd6ed5f10113b875f4dc0 WatchSource:0}: Error finding container 507afcf519b9b677af4e210900e46d224e8a376c8acdd6ed5f10113b875f4dc0: Status 404 returned error can't find the container with id 507afcf519b9b677af4e210900e46d224e8a376c8acdd6ed5f10113b875f4dc0
Apr 21 16:06:04.036439 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:04.036405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dt7xd/must-gather-pzt4m" event={"ID":"f3fd20f0-07a3-4202-9ded-94c48a06d179","Type":"ContainerStarted","Data":"507afcf519b9b677af4e210900e46d224e8a376c8acdd6ed5f10113b875f4dc0"}
Apr 21 16:06:05.043788 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:05.043751 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dt7xd/must-gather-pzt4m" event={"ID":"f3fd20f0-07a3-4202-9ded-94c48a06d179","Type":"ContainerStarted","Data":"5cee1aab3bdbeafbe28b0d32916fecabd2a54869749904463ddc850e12f3ed2e"}
Apr 21 16:06:05.044254 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:05.044074 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dt7xd/must-gather-pzt4m" event={"ID":"f3fd20f0-07a3-4202-9ded-94c48a06d179","Type":"ContainerStarted","Data":"a175ea7a729156838c9e54b614c44556bc8ff7f7b3ed819794dcaabcd3b5e42c"}
Apr 21 16:06:05.060358 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:05.060304 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dt7xd/must-gather-pzt4m" podStartSLOduration=2.200622952 podStartE2EDuration="3.060285751s" podCreationTimestamp="2026-04-21 16:06:02 +0000 UTC" firstStartedPulling="2026-04-21 16:06:03.204988882 +0000 UTC m=+1858.335042494" lastFinishedPulling="2026-04-21 16:06:04.06465168 +0000 UTC m=+1859.194705293" observedRunningTime="2026-04-21 16:06:05.058974669 +0000 UTC m=+1860.189028304" watchObservedRunningTime="2026-04-21 16:06:05.060285751 +0000 UTC m=+1860.190339386"
Apr 21 16:06:05.666627 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:05.666587 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4pjd9_97199e5c-4c05-4197-84c9-e95b525f3ae1/global-pull-secret-syncer/0.log"
Apr 21 16:06:05.801827 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:05.801778 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xph98_63dd9652-ce6c-4395-ae74-cba66c5a8c72/konnectivity-agent/0.log"
Apr 21 16:06:05.874884 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:05.874849 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-123.ec2.internal_e27d8eec64ea5a1e06e35be5839e2c48/haproxy/0.log"
Apr 21 16:06:07.388226 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.388181 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-p4kt2/must-gather-zslsl"]
Apr 21 16:06:07.388722 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.388476 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-p4kt2/must-gather-zslsl" podUID="23eff0b7-5be9-4040-9d23-628312a83023" containerName="copy" containerID="cri-o://4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd" gracePeriod=2
Apr 21 16:06:07.391030 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.390991 2573 status_manager.go:895] "Failed to get status for pod"
podUID="23eff0b7-5be9-4040-9d23-628312a83023" pod="openshift-must-gather-p4kt2/must-gather-zslsl" err="pods \"must-gather-zslsl\" is forbidden: User \"system:node:ip-10-0-136-123.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-p4kt2\": no relationship found between node 'ip-10-0-136-123.ec2.internal' and this object"
Apr 21 16:06:07.391605 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.391578 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-p4kt2/must-gather-zslsl"]
Apr 21 16:06:07.755998 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.755731 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p4kt2_must-gather-zslsl_23eff0b7-5be9-4040-9d23-628312a83023/copy/0.log"
Apr 21 16:06:07.756497 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.756329 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:06:07.947846 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.947250 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23eff0b7-5be9-4040-9d23-628312a83023-must-gather-output\") pod \"23eff0b7-5be9-4040-9d23-628312a83023\" (UID: \"23eff0b7-5be9-4040-9d23-628312a83023\") "
Apr 21 16:06:07.947846 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.947327 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw9pk\" (UniqueName: \"kubernetes.io/projected/23eff0b7-5be9-4040-9d23-628312a83023-kube-api-access-sw9pk\") pod \"23eff0b7-5be9-4040-9d23-628312a83023\" (UID: \"23eff0b7-5be9-4040-9d23-628312a83023\") "
Apr 21 16:06:07.956107 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.950358 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23eff0b7-5be9-4040-9d23-628312a83023-kube-api-access-sw9pk" (OuterVolumeSpecName: "kube-api-access-sw9pk") pod "23eff0b7-5be9-4040-9d23-628312a83023" (UID: "23eff0b7-5be9-4040-9d23-628312a83023"). InnerVolumeSpecName "kube-api-access-sw9pk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 16:06:07.956936 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:07.956762 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23eff0b7-5be9-4040-9d23-628312a83023-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "23eff0b7-5be9-4040-9d23-628312a83023" (UID: "23eff0b7-5be9-4040-9d23-628312a83023"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:06:08.049036 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.048916 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23eff0b7-5be9-4040-9d23-628312a83023-must-gather-output\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:06:08.049036 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.048957 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sw9pk\" (UniqueName: \"kubernetes.io/projected/23eff0b7-5be9-4040-9d23-628312a83023-kube-api-access-sw9pk\") on node \"ip-10-0-136-123.ec2.internal\" DevicePath \"\""
Apr 21 16:06:08.059869 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.059828 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-p4kt2_must-gather-zslsl_23eff0b7-5be9-4040-9d23-628312a83023/copy/0.log"
Apr 21 16:06:08.060447 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.060421 2573 generic.go:358] "Generic (PLEG): container finished" podID="23eff0b7-5be9-4040-9d23-628312a83023" containerID="4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd" exitCode=143
Apr 21 16:06:08.060646 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.060636 2573 scope.go:117] "RemoveContainer" containerID="4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd"
Apr 21 16:06:08.060879 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.060865 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p4kt2/must-gather-zslsl"
Apr 21 16:06:08.078136 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.077455 2573 scope.go:117] "RemoveContainer" containerID="9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd"
Apr 21 16:06:08.107572 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.107078 2573 scope.go:117] "RemoveContainer" containerID="4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd"
Apr 21 16:06:08.110771 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:06:08.109106 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd\": container with ID starting with 4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd not found: ID does not exist" containerID="4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd"
Apr 21 16:06:08.110771 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.109151 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd"} err="failed to get container status \"4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd\": rpc error: code = NotFound desc = could not find container \"4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd\": container with ID starting with 4bd6c340d3ef81d4224fabb8fa61156c9e439bd34e25c12f1322445ca3209fdd not found: ID does not exist"
Apr 21 16:06:08.110771 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.109176 2573 scope.go:117] "RemoveContainer"
containerID="9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd" Apr 21 16:06:08.114869 ip-10-0-136-123 kubenswrapper[2573]: E0421 16:06:08.113054 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd\": container with ID starting with 9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd not found: ID does not exist" containerID="9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd" Apr 21 16:06:08.114869 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:08.113095 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd"} err="failed to get container status \"9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd\": rpc error: code = NotFound desc = could not find container \"9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd\": container with ID starting with 9162905db6668cc78e62a697d925b91cd8a9fe98d175beb197fb8051435f6ccd not found: ID does not exist" Apr 21 16:06:09.484374 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:09.484335 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23eff0b7-5be9-4040-9d23-628312a83023" path="/var/lib/kubelet/pods/23eff0b7-5be9-4040-9d23-628312a83023/volumes" Apr 21 16:06:09.715620 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:09.715478 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-jpxwt_bfa1b708-a291-48e4-a35f-8ddf287fc1b8/manager/0.log" Apr 21 16:06:09.885278 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:09.885244 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-skws5_65e3be56-85d2-4fd7-8fc9-dd5648a64ac8/limitador/0.log" Apr 21 16:06:09.918186 ip-10-0-136-123 
kubenswrapper[2573]: I0421 16:06:09.918151 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-8z254_31e563af-bf75-499d-93b3-29b3469180f2/manager/0.log" Apr 21 16:06:11.575525 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:11.575496 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vmrmq_9a3f9599-c99a-4c6a-b295-b12b9a4fbc96/node-exporter/0.log" Apr 21 16:06:11.597506 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:11.597480 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vmrmq_9a3f9599-c99a-4c6a-b295-b12b9a4fbc96/kube-rbac-proxy/0.log" Apr 21 16:06:11.619479 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:11.619445 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vmrmq_9a3f9599-c99a-4c6a-b295-b12b9a4fbc96/init-textfile/0.log" Apr 21 16:06:14.543962 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.543925 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j"] Apr 21 16:06:14.544439 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.544401 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23eff0b7-5be9-4040-9d23-628312a83023" containerName="gather" Apr 21 16:06:14.544439 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.544419 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="23eff0b7-5be9-4040-9d23-628312a83023" containerName="gather" Apr 21 16:06:14.544513 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.544440 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23eff0b7-5be9-4040-9d23-628312a83023" containerName="copy" Apr 21 16:06:14.544513 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.544451 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="23eff0b7-5be9-4040-9d23-628312a83023" containerName="copy" Apr 21 16:06:14.544576 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.544529 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="23eff0b7-5be9-4040-9d23-628312a83023" containerName="copy" Apr 21 16:06:14.544576 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.544541 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="23eff0b7-5be9-4040-9d23-628312a83023" containerName="gather" Apr 21 16:06:14.549485 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.549466 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.556964 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.556936 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j"] Apr 21 16:06:14.631492 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.631455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-podres\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.631861 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.631825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-sys\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.631998 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.631959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-proc\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.631998 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.631979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-lib-modules\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.632106 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.632081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p9sq\" (UniqueName: \"kubernetes.io/projected/7a1131eb-2432-476e-abd8-cff48d9580f7-kube-api-access-4p9sq\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.733560 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-podres\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.733831 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-sys\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.733831 
ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-proc\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.733831 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-lib-modules\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.733831 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733715 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p9sq\" (UniqueName: \"kubernetes.io/projected/7a1131eb-2432-476e-abd8-cff48d9580f7-kube-api-access-4p9sq\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.733831 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-sys\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.733831 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-proc\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: 
\"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.733831 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-podres\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.734219 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.733899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a1131eb-2432-476e-abd8-cff48d9580f7-lib-modules\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.743196 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.743163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p9sq\" (UniqueName: \"kubernetes.io/projected/7a1131eb-2432-476e-abd8-cff48d9580f7-kube-api-access-4p9sq\") pod \"perf-node-gather-daemonset-dfb9j\" (UID: \"7a1131eb-2432-476e-abd8-cff48d9580f7\") " pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:14.862140 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:14.862095 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:15.000386 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:15.000361 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j"] Apr 21 16:06:15.002903 ip-10-0-136-123 kubenswrapper[2573]: W0421 16:06:15.002860 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7a1131eb_2432_476e_abd8_cff48d9580f7.slice/crio-6d1681ac8abd3fc3e6a729120252cb9b48de9502b798ddfde17e12dc5faba641 WatchSource:0}: Error finding container 6d1681ac8abd3fc3e6a729120252cb9b48de9502b798ddfde17e12dc5faba641: Status 404 returned error can't find the container with id 6d1681ac8abd3fc3e6a729120252cb9b48de9502b798ddfde17e12dc5faba641 Apr 21 16:06:15.103301 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:15.103269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" event={"ID":"7a1131eb-2432-476e-abd8-cff48d9580f7","Type":"ContainerStarted","Data":"6d1681ac8abd3fc3e6a729120252cb9b48de9502b798ddfde17e12dc5faba641"} Apr 21 16:06:15.626428 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:15.626379 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p6s7m_965e7720-2b43-4a79-9af6-74b4a24a9047/dns/0.log" Apr 21 16:06:15.646003 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:15.645976 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p6s7m_965e7720-2b43-4a79-9af6-74b4a24a9047/kube-rbac-proxy/0.log" Apr 21 16:06:15.716820 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:15.716778 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lkj85_96e29eb1-d270-4d82-a139-d970d1863b1c/dns-node-resolver/0.log" Apr 21 16:06:16.109773 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:16.109737 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" event={"ID":"7a1131eb-2432-476e-abd8-cff48d9580f7","Type":"ContainerStarted","Data":"d3b8d9e5d1403a2c21820e9f7b488258874433e0cdcb892c269e59de9ebce750"} Apr 21 16:06:16.110013 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:16.109814 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:16.134095 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:16.129091 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" podStartSLOduration=2.128779003 podStartE2EDuration="2.128779003s" podCreationTimestamp="2026-04-21 16:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:06:16.125887326 +0000 UTC m=+1871.255940985" watchObservedRunningTime="2026-04-21 16:06:16.128779003 +0000 UTC m=+1871.258832628" Apr 21 16:06:16.251584 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:16.251554 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m9xpg_39fccf23-7816-40f1-9d1a-0711aca322c8/node-ca/0.log" Apr 21 16:06:17.183518 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:17.183492 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-8z8tz_79cfeeaf-2ff6-4d93-9ee3-7d75b97d3996/istio-proxy/0.log" Apr 21 16:06:17.692048 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:17.692003 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zswf8_ecc2bf4d-8668-46f7-a489-514b0b505d8c/serve-healthcheck-canary/0.log" Apr 21 16:06:18.302888 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:18.302858 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-htkxj_97c1233c-3be7-4359-982f-fb2aaa9a7fea/kube-rbac-proxy/0.log" Apr 21 16:06:18.322156 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:18.322126 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-htkxj_97c1233c-3be7-4359-982f-fb2aaa9a7fea/exporter/0.log" Apr 21 16:06:18.342009 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:18.341990 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-htkxj_97c1233c-3be7-4359-982f-fb2aaa9a7fea/extractor/0.log" Apr 21 16:06:21.382618 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:21.382589 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-9c85dd4d8-zp5r9_4ad48c88-fd28-4a98-bf60-c7e0dd7a58f6/manager/0.log" Apr 21 16:06:21.471086 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:21.471059 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-l4gvp_2c857db3-273e-45ae-afd5-424141f11fbb/server/0.log" Apr 21 16:06:21.682555 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:21.682476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-lnjn8_a8b86f3e-2c7b-4474-bc10-18c0c1269089/manager/0.log" Apr 21 16:06:21.700711 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:21.700678 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-hgbvx_d697e35b-9a68-4f52-b436-5e4f4784ff7c/s3-init/0.log" Apr 21 16:06:21.725274 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:21.725245 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-b8b7x_06c79565-0f30-407d-99a1-82fd07c760f3/seaweedfs/0.log" Apr 21 16:06:22.122707 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:22.122674 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-dt7xd/perf-node-gather-daemonset-dfb9j" Apr 21 16:06:27.595876 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:27.595839 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gkvb2_31c04054-fa66-445a-9246-9c32b20cd60d/kube-multus-additional-cni-plugins/0.log" Apr 21 16:06:27.617462 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:27.617435 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gkvb2_31c04054-fa66-445a-9246-9c32b20cd60d/egress-router-binary-copy/0.log" Apr 21 16:06:27.637983 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:27.637960 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gkvb2_31c04054-fa66-445a-9246-9c32b20cd60d/cni-plugins/0.log" Apr 21 16:06:27.656948 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:27.656913 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gkvb2_31c04054-fa66-445a-9246-9c32b20cd60d/bond-cni-plugin/0.log" Apr 21 16:06:27.676158 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:27.676130 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gkvb2_31c04054-fa66-445a-9246-9c32b20cd60d/routeoverride-cni/0.log" Apr 21 16:06:27.698316 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:27.698296 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gkvb2_31c04054-fa66-445a-9246-9c32b20cd60d/whereabouts-cni-bincopy/0.log" Apr 21 16:06:27.719556 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:27.719536 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gkvb2_31c04054-fa66-445a-9246-9c32b20cd60d/whereabouts-cni/0.log" Apr 21 16:06:28.135456 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:28.135427 
2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rvv9j_a7bb4a1e-4105-43ec-a600-43495885c030/kube-multus/0.log" Apr 21 16:06:28.192311 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:28.192279 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-28b7m_9c107ca7-f14c-4f8c-a8d4-4e08e3acb233/network-metrics-daemon/0.log" Apr 21 16:06:28.212995 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:28.212962 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-28b7m_9c107ca7-f14c-4f8c-a8d4-4e08e3acb233/kube-rbac-proxy/0.log" Apr 21 16:06:29.450652 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:29.450622 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-controller/0.log" Apr 21 16:06:29.468085 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:29.468062 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/0.log" Apr 21 16:06:29.477284 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:29.477262 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovn-acl-logging/1.log" Apr 21 16:06:29.497418 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:29.497394 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/kube-rbac-proxy-node/0.log" Apr 21 16:06:29.517652 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:29.517632 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 16:06:29.535467 ip-10-0-136-123 kubenswrapper[2573]: I0421 
16:06:29.535442 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/northd/0.log" Apr 21 16:06:29.556060 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:29.556039 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/nbdb/0.log" Apr 21 16:06:29.579903 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:29.579884 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/sbdb/0.log" Apr 21 16:06:29.689530 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:29.689503 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9v44z_a8821bf6-e244-4b55-bfcc-7d85dec39bc4/ovnkube-controller/0.log" Apr 21 16:06:31.164335 ip-10-0-136-123 kubenswrapper[2573]: I0421 16:06:31.164260 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ntgnx_8ea4d113-155e-4fa2-b765-c12d26b37fa1/network-check-target-container/0.log"