Apr 16 17:37:34.442073 ip-10-0-138-45 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 17:37:34.442084 ip-10-0-138-45 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 17:37:34.442091 ip-10-0-138-45 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 17:37:34.442310 ip-10-0-138-45 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 17:37:44.619333 ip-10-0-138-45 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 17:37:44.619374 ip-10-0-138-45 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 721f901bbed14c60bb696c5f37f1b5d4 --
Apr 16 17:39:49.705897 ip-10-0-138-45 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 17:39:50.216795 ip-10-0-138-45 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:39:50.216795 ip-10-0-138-45 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 17:39:50.216795 ip-10-0-138-45 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:39:50.216795 ip-10-0-138-45 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 17:39:50.216795 ip-10-0-138-45 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:39:50.217968 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.217888    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 17:39:50.222435 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222420    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:39:50.222435 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222435    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222439    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222442    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222445    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222448    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222453    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222457    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222460    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222463    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222465    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222468    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222471    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222474    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222477    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222485    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222488    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222491    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222493    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222496    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222499    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:39:50.222497 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222501    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222504    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222507    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222510    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222512    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222515    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222518    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222520    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222523    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222525    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222527    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222530    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222532    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222535    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222537    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222540    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222542    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222544    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222547    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222550    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:39:50.222986 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222553    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222555    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222558    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222561    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222563    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222566    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222587    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222590    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222592    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222595    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222597    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222601    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222605    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222608    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222611    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222614    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222617    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222620    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222622    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222625    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:39:50.223452 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222627    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222629    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222632    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222635    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222637    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222640    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222642    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222645    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222647    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222650    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222652    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222654    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222660    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222663    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222666    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222668    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222671    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222674    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222676    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:39:50.223973 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222678    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222681    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222683    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222687    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222689    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.222692    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223082    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223088    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223090    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223093    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223096    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223099    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223101    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223104    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223106    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223109    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223112    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223114    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223117    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223119    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223122    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:39:50.224442 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223124    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223126    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223129    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223131    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223140    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223143    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223146    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223148    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223151    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223154    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223156    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223159    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223161    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223164    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223166    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223169    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223171    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223174    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223177    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223180    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:39:50.224964 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223182    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223184    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223187    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223189    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223192    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223194    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223196    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223199    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223201    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223204    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223206    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223208    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223211    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223213    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223216    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223218    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223220    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223223    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223225    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223228    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:39:50.225458 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223230    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223232    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223235    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223237    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223239    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223242    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223244    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223246    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223249    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223252    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223255    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223258    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223263    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223267    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223269    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223272    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223274    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223277    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223279    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:39:50.225949 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223282    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223284    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223286    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223289    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223291    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223295    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223298    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223301    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223304    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223306    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223309    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.223312    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223387    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223394    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223401    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223405    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223409    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223413    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223418    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223422    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 17:39:50.226410 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223425    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223428    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223432    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223436    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223439    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223442    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223445    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223448    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223451    2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223454    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223457    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223461    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223464    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223468    2574 flags.go:64] FLAG: --config-dir=""
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223470    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223474    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223478    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223481    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223484    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223487    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223490    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223493    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223496    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223499    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223503    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 17:39:50.226905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223507    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223510    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223513    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223516    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223519    2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223522    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223526    2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223530    2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223532    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223535    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223539    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223543    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223546    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223549    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223552    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223555    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223558    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223561    2574 flags.go:64] FLAG:
--experimental-allocatable-ignore-eviction="false" Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223563 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223566 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223583 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223586 2574 flags.go:64] FLAG: --feature-gates="" Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223591 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223594 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223597 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 17:39:50.227492 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223600 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223604 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223607 2574 flags.go:64] FLAG: --help="false" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223610 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-138-45.ec2.internal" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223613 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223616 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223619 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 
16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223627 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223631 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223633 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223636 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223639 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223642 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223645 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223648 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223651 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223653 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223656 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223659 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223662 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223665 2574 flags.go:64] FLAG: --lock-file="" Apr 16 17:39:50.228093 
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223668 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223671 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223674 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 17:39:50.228093 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223679 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223682 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223685 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223687 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223690 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223693 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223696 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223699 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223703 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223706 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223710 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223713 2574 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223716 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223719 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223722 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223725 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223728 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223731 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223738 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223741 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223744 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223747 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223750 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 17:39:50.228721 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223756 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223758 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: 
I0416 17:39:50.223762 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223765 2574 flags.go:64] FLAG: --port="10250" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223768 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223771 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f646cc0a47cef750" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223774 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223777 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223780 2574 flags.go:64] FLAG: --register-node="true" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223783 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223786 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223790 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223792 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223795 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223798 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223801 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223804 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223806 2574 
flags.go:64] FLAG: --rotate-certificates="false" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223809 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223812 2574 flags.go:64] FLAG: --runonce="false" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223815 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223817 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223820 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223823 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223826 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223829 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 17:39:50.229271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223833 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223836 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223839 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223842 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223845 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223848 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 
17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223851 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223856 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223858 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223864 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223866 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223869 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223874 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223877 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223879 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223882 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223885 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223888 2574 flags.go:64] FLAG: --v="2" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223893 2574 flags.go:64] FLAG: --version="false" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223897 2574 flags.go:64] FLAG: --vmodule="" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223901 2574 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.223904 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224015 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224018 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224021 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 17:39:50.229948 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224024 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224027 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224030 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224032 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224035 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224038 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224040 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224043 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224047 2574 
feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224049 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224053 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224056 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224058 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224060 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224065 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224067 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224070 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224072 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224075 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224077 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:39:50.230543 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224080 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 
17:39:50.224082 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224085 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224088 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224090 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224093 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224095 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224097 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224100 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224103 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224105 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224108 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224110 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224113 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 17:39:50.231096 
ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224115 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224117 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224120 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224123 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224125 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224127 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 17:39:50.231096 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224130 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224134 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224139 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224141 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224144 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224146 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224150 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224154 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224157 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224161 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224163 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224166 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224169 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224172 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224174 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224177 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224180 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224183 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224186 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:39:50.231636 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224189 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224191 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224193 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224196 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224198 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224201 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224203 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224206 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224208 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224211 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224213 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224215 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224218 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224220 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224223 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224227 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224230 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224233 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224235 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:39:50.232506 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224239 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:39:50.233077 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224242 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:39:50.233077 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224244 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:39:50.233077 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224246 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:39:50.233077 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.224249 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:39:50.233077 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.225211 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:39:50.233624 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.233607 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 17:39:50.233662 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.233625 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 17:39:50.233692 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233676 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:39:50.233692 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233681 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:39:50.233692 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233684 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:39:50.233692 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233688 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:39:50.233692 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233691 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:39:50.233692 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233694 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233696 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233699 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233702 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233705 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233707 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233710 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233712 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233715 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233717 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233720 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233722 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233725 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233728 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233730 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233733 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233736 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:39:50.233845 ip-10-0-138-45
kubenswrapper[2574]: W0416 17:39:50.233738 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233741 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233744 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:39:50.233845 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233747 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233749 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233752 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233754 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233757 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233759 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233762 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233764 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233767 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233770 2574 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233772 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233775 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233777 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233779 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233783 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233786 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233788 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233791 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233794 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233796 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:39:50.234327 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233799 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233801 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233804 2574 feature_gate.go:328] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233806 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233809 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233813 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233818 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233821 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233823 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233826 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233828 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233831 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233833 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233836 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233838 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 
17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233841 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233843 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233846 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233848 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 17:39:50.234848 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233851 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233854 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233857 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233860 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233863 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233866 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233868 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233871 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233874 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233877 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233879 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233882 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233886 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233888 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233892 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233894 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233897 2574 feature_gate.go:328] 
unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233899 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233902 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 17:39:50.235304 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233904 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233906 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.233909 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.233914 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234019 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234024 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234027 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: 
W0416 17:39:50.234030 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234033 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234035 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234037 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234040 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234042 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234045 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234047 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234050 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 17:39:50.235781 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234052 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234055 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234057 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234060 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 
17:39:50.234062 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234065 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234068 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234071 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234074 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234077 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234080 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234083 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234085 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234088 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234090 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234093 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234095 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 
17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234097 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234100 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234103 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:39:50.236166 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234105 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234108 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234110 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234113 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234115 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234117 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234120 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234122 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234124 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234127 2574 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234129 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234132 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234134 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234136 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234139 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234141 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234144 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234146 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234149 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234151 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 17:39:50.236683 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234154 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234156 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 
17:39:50.234158 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234163 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234166 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234169 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234171 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234174 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234177 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234179 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234182 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234184 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234186 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234189 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234192 2574 feature_gate.go:349] Setting deprecated feature gate 
KMSv1=true. It will be removed in a future release. Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234195 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234198 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234200 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234202 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:39:50.237163 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234205 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234207 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234210 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234212 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234214 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234217 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234219 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234222 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:39:50.237642 ip-10-0-138-45 
kubenswrapper[2574]: W0416 17:39:50.234224 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234227 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234230 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234233 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234235 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234238 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:50.234240 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:39:50.237642 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.234245 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 17:39:50.238085 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.234355 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 17:39:50.238085 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.237587 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set 
kubeconfig to point to the certificate dir"
Apr 16 17:39:50.238656 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.238644 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 17:39:50.238765 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.238745 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:39:50.238802 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.238787 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:39:50.269097 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.269071 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:39:50.271755 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.271735 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:39:50.285703 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.285681 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 17:39:50.294488 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.294466 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 17:39:50.298218 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.298195 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 17:39:50.300733 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.300715 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:39:50.301066 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.301043 2574 fs.go:135] Filesystem UUIDs: map[3fd0ca34-6910-43f7-88e2-dc90881952ff:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 80106355-a640-46a9-b911-7fc28cee3df8:/dev/nvme0n1p3]
Apr 16 17:39:50.301108 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.301068 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 17:39:50.306230 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.306114 2574 manager.go:217] Machine: {Timestamp:2026-04-16 17:39:50.304773081 +0000 UTC m=+0.464143373 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3259974 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20adb027f4928e8feb956281cd9471 SystemUUID:ec20adb0-27f4-928e-8feb-956281cd9471 BootID:721f901b-bed1-4c60-bb69-6c5f37f1b5d4 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4c:ef:b4:00:9f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4c:ef:b4:00:9f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:19:74:78:19:7c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 17:39:50.306230 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.306226 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 17:39:50.306336 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.306320 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 17:39:50.307626 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.307596 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 17:39:50.307766 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.307628 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-45.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 17:39:50.307815 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.307776 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 17:39:50.307815 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.307785 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 17:39:50.307815 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.307798 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:39:50.308623 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.308612 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:39:50.310553 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.310542 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 17:39:50.310871 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.310861 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 17:39:50.313677 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.313667 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 17:39:50.313724 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.313690 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 17:39:50.313724 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.313705 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 17:39:50.313724 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.313716 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 16 17:39:50.313842 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.313725 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 17:39:50.314941 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.314929 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:39:50.314994 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.314949 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:39:50.318587 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.318550 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 17:39:50.320600 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.320562 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 17:39:50.322086 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322074 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322091 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322098 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322104 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322109 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322115 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322120 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322126 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322132 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322138 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322146 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 17:39:50.322150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.322154 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 17:39:50.324004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.323993 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 17:39:50.324004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.324003 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 17:39:50.327735 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.327721 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 17:39:50.327800 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.327761 2574 server.go:1295] "Started kubelet"
Apr 16 17:39:50.328601 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.328549 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 17:39:50.328678 ip-10-0-138-45 systemd[1]: Started Kubernetes Kubelet.
Apr 16 17:39:50.329057 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.328799 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 17:39:50.329057 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.328891 2574 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 17:39:50.329623 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.329438 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-45.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 17:39:50.329623 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.329467 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-45.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 17:39:50.329623 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.329564 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 17:39:50.330687 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.330660 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 17:39:50.331628 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.331615 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 17:39:50.334872 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.333368 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-45.ec2.internal.18a6e7174e46512b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-45.ec2.internal,UID:ip-10-0-138-45.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-45.ec2.internal,},FirstTimestamp:2026-04-16 17:39:50.327734571 +0000 UTC m=+0.487104844,LastTimestamp:2026-04-16 17:39:50.327734571 +0000 UTC m=+0.487104844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-45.ec2.internal,}"
Apr 16 17:39:50.337764 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.337746 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 17:39:50.338058 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.338041 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 17:39:50.338738 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.338723 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 17:39:50.340905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.340874 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 17:39:50.341001 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.340885 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 17:39:50.341001 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.340989 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 17:39:50.341172 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.341149 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:50.341172 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.341167 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 17:39:50.341312 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.341182 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 17:39:50.343341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.343326 2574 factory.go:55] Registering systemd factory
Apr 16 17:39:50.343451 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.343426 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 16 17:39:50.344138 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.344121 2574 factory.go:153] Registering CRI-O factory
Apr 16 17:39:50.344138 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.344138 2574 factory.go:223] Registration of the crio container factory successfully
Apr 16 17:39:50.344237 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.344220 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 17:39:50.344273 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.344247 2574 factory.go:103] Registering Raw factory
Apr 16 17:39:50.344273 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.344266 2574 manager.go:1196] Started watching for new ooms in manager
Apr 16 17:39:50.344732 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.344715 2574 manager.go:319] Starting recovery of all containers
Apr 16 17:39:50.345004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.344985 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xjrwr"
Apr 16 17:39:50.352195 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.352041 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 17:39:50.352302 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.352080 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-45.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 17:39:50.355109 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.355088 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xjrwr"
Apr 16 17:39:50.355357 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.355340 2574 manager.go:324] Recovery completed
Apr 16 17:39:50.359992 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.359979 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:39:50.363015 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.362999 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:39:50.363083 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.363029 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:39:50.363083 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.363040 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:39:50.363743 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.363718 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 17:39:50.363743 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.363732 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 17:39:50.363743 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.363747 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 17:39:50.367795 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.367783 2574 policy_none.go:49] "None policy: Start"
Apr 16 17:39:50.367841 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.367800 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 17:39:50.367841 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.367811 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 17:39:50.419200 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.419182 2574 manager.go:341] "Starting Device Plugin manager"
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.419221 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.419234 2574 server.go:85] "Starting device plugin registration server"
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.419478 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.419490 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.419653 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.419733 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.419741 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.420265 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 17:39:50.432207 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.420310 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:50.448322 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.448288 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 17:39:50.449708 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.449655 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 17:39:50.449708 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.449681 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 17:39:50.449708 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.449699 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 17:39:50.449708 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.449705 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 17:39:50.449920 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.449736 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 17:39:50.455442 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.455420 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:39:50.520614 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.520557 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:39:50.521949 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.521928 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:39:50.522050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.521968 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:39:50.522050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.521982 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:39:50.522050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.522014 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.530141 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.530128 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.530224 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.530148 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-45.ec2.internal\": node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:50.549122 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.549092 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:50.550059 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.550029 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal"]
Apr 16 17:39:50.550113 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.550104 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:39:50.553994 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.553979 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:39:50.554079 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.554005 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:39:50.554079 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.554015 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:39:50.556378 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.556366 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:39:50.556536 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.556521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.556590 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.556551 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:39:50.557501 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.557484 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:39:50.557599 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.557507 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:39:50.557599 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.557516 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:39:50.557599 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.557539 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:39:50.557599 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.557563 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:39:50.557599 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.557593 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:39:50.560361 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.560344 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.560438 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.560372 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:39:50.561075 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.561062 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:39:50.561144 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.561085 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:39:50.561144 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.561095 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:39:50.590440 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.590419 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-45.ec2.internal\" not found" node="ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.594960 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.594944 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-45.ec2.internal\" not found" node="ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.643620 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.643588 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cc1dec6d25ddbc91146c8c11e71acdd6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal\" (UID: \"cc1dec6d25ddbc91146c8c11e71acdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.643734 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.643624 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc1dec6d25ddbc91146c8c11e71acdd6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal\" (UID: \"cc1dec6d25ddbc91146c8c11e71acdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.643734 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.643642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2eb2699ca22310dd4864f65d554362ea-config\") pod \"kube-apiserver-proxy-ip-10-0-138-45.ec2.internal\" (UID: \"2eb2699ca22310dd4864f65d554362ea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.649772 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.649751 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:50.744106 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.744033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cc1dec6d25ddbc91146c8c11e71acdd6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal\" (UID: \"cc1dec6d25ddbc91146c8c11e71acdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.744106 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.744078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc1dec6d25ddbc91146c8c11e71acdd6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal\" (UID: \"cc1dec6d25ddbc91146c8c11e71acdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.744106 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.744096 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2eb2699ca22310dd4864f65d554362ea-config\") pod \"kube-apiserver-proxy-ip-10-0-138-45.ec2.internal\" (UID: \"2eb2699ca22310dd4864f65d554362ea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.744245 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.744139 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2eb2699ca22310dd4864f65d554362ea-config\") pod \"kube-apiserver-proxy-ip-10-0-138-45.ec2.internal\" (UID: \"2eb2699ca22310dd4864f65d554362ea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.744245 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.744164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc1dec6d25ddbc91146c8c11e71acdd6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal\" (UID: \"cc1dec6d25ddbc91146c8c11e71acdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.744245 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.744165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/cc1dec6d25ddbc91146c8c11e71acdd6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal\" (UID: \"cc1dec6d25ddbc91146c8c11e71acdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.750134 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.750113 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:50.851132 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.851093 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:50.893312 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.893285 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.897787 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:50.897760 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:50.951795 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:50.951749 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:51.052424 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:51.052347 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:51.153016 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:51.152974 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:51.210278 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.210245 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:39:51.238445 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.238412 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 17:39:51.238977 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.238560 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:39:51.238977 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.238607 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:39:51.253924 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:51.253897 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:51.285655 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.285626 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:39:51.338466 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.338433 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 17:39:51.348739 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.348711 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:39:51.354173 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:51.354156 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-45.ec2.internal\" not found"
Apr 16 17:39:51.357332 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.357302 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:34:50 +0000 UTC" deadline="2027-10-25 00:21:11.964263142 +0000 UTC"
Apr 16 17:39:51.357332 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.357331 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13350h41m20.606936454s"
Apr 16 17:39:51.370265 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.370243 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nkzrm"
Apr 16 17:39:51.376416 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.376400 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:39:51.379695 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.379682 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nkzrm"
Apr 16 17:39:51.439684 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.439655 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal"
Apr 16 17:39:51.450023 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:51.449985 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc1dec6d25ddbc91146c8c11e71acdd6.slice/crio-5d93906cfe9865b6ddd1bcb05763747b9d1f4341d69b72462908aa4d005e4194 WatchSource:0}: Error finding container 5d93906cfe9865b6ddd1bcb05763747b9d1f4341d69b72462908aa4d005e4194: Status 404 returned error can't find the container with id 5d93906cfe9865b6ddd1bcb05763747b9d1f4341d69b72462908aa4d005e4194
Apr 16 17:39:51.450265 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:51.450250 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eb2699ca22310dd4864f65d554362ea.slice/crio-0f75f0e4af8febe19f9fc5f4e29aae8e0cc610f627e9eb51e15ad24a6c2a89e0 WatchSource:0}: Error finding container 0f75f0e4af8febe19f9fc5f4e29aae8e0cc610f627e9eb51e15ad24a6c2a89e0: Status 404 returned error can't find the container with id 0f75f0e4af8febe19f9fc5f4e29aae8e0cc610f627e9eb51e15ad24a6c2a89e0
Apr 16 17:39:51.454056
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.454040 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:39:51.456059 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.456042 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 17:39:51.457123 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.457108 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal" Apr 16 17:39:51.463434 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:51.463419 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 17:39:52.314823 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.314796 2574 apiserver.go:52] "Watching apiserver" Apr 16 17:39:52.322075 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.322047 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 17:39:52.323274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.323250 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-9465m","kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal","openshift-multus/multus-7zg22","openshift-network-diagnostics/network-check-target-2kgzs","openshift-network-operator/iptables-alerter-cwp8j","openshift-ovn-kubernetes/ovnkube-node-42vwn","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz","openshift-cluster-node-tuning-operator/tuned-thhnl","openshift-image-registry/node-ca-gbhm8","openshift-multus/multus-additional-cni-plugins-5ww9l","openshift-multus/network-metrics-daemon-cxxrm"] Apr 16 17:39:52.327742 
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.327716 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9465m" Apr 16 17:39:52.329871 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.329829 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.330388 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.330368 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-h9tm8\"" Apr 16 17:39:52.330644 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.330627 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 17:39:52.331938 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.331918 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:39:52.332021 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:52.332000 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:39:52.332075 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.332028 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.334200 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.334183 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.336244 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.336218 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 17:39:52.337258 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.336628 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 17:39:52.337258 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.336669 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 17:39:52.337866 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.337849 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.339919 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.339898 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 17:39:52.340912 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.340892 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.341540 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.341520 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:39:52.341974 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.341955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 17:39:52.342095 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.341978 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xlkxb\"" Apr 16 17:39:52.342095 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.341961 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 17:39:52.342095 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342010 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7srdm\"" Apr 16 17:39:52.342095 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342092 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 17:39:52.342288 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342174 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 17:39:52.342288 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342183 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 17:39:52.342288 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342195 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tknzl\"" Apr 16 17:39:52.342288 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342279 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 17:39:52.342470 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342177 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mnbq7\"" Apr 16 17:39:52.342470 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342356 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 17:39:52.342593 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342514 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 17:39:52.343004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342794 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 17:39:52.343004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342834 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 17:39:52.343004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342795 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 17:39:52.343004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.342914 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 17:39:52.343539 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.343522 2574 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:39:52.343836 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.343814 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.344180 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.344158 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 17:39:52.344356 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.344338 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5v4bz\"" Apr 16 17:39:52.346539 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.346479 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.347587 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.347367 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 17:39:52.347587 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.347404 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q85p4\"" Apr 16 17:39:52.347587 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.347428 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 17:39:52.349030 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.349009 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:39:52.349135 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:52.349075 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05" Apr 16 17:39:52.351501 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.351304 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 17:39:52.351501 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.351343 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 17:39:52.351501 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.351370 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hgp2z\"" Apr 16 17:39:52.351501 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.351409 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 17:39:52.353654 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353636 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1f0a51e-003d-4753-8b66-791533e66dea-cni-binary-copy\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.353749 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353658 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/810acbe7-5369-4316-8f52-ddfcdd34e838-serviceca\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.353749 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353683 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ck7g\" (UniqueName: \"kubernetes.io/projected/810acbe7-5369-4316-8f52-ddfcdd34e838-kube-api-access-2ck7g\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.353749 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353700 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-registration-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.353749 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-device-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.353954 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysconfig\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.353954 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353823 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-tuned\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.353954 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-hostroot\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.353954 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353898 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mmnm\" (UniqueName: \"kubernetes.io/projected/08a0b4d9-4a6f-45bc-821e-59612b76469a-kube-api-access-4mmnm\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.353954 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysctl-d\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.354192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.353974 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3277964-04b5-4058-a09e-f604a2b93fe1-tmp\") pod 
\"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.354192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-systemd\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.354192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-os-release\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.354192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-cni-bin\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.354192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354096 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qx9d\" (UniqueName: \"kubernetes.io/projected/2ee69609-b777-4f06-988b-8e74dd389f13-kube-api-access-2qx9d\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.354192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354121 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvl7j\" (UniqueName: 
\"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:39:52.354192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354143 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-sys\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.354192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-system-cni-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-kubelet\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354284 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-cnibin\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354316 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-var-lib-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-run-ovn-kubernetes\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354366 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-lib-modules\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-var-lib-kubelet\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354396 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-etc-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-netns\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-etc-kubernetes\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-systemd-units\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c9af89df-b4ab-4045-b2fc-4762862c0d37-agent-certs\") pod 
\"konnectivity-agent-9465m\" (UID: \"c9af89df-b4ab-4045-b2fc-4762862c0d37\") " pod="kube-system/konnectivity-agent-9465m" Apr 16 17:39:52.354564 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-systemd\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvrjx\" (UniqueName: \"kubernetes.io/projected/fb3c3b95-67e6-454b-854a-5ec64e63536a-kube-api-access-cvrjx\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-cni-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-ovn\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-log-socket\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354711 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysctl-conf\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354732 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-run\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354799 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c9af89df-b4ab-4045-b2fc-4762862c0d37-konnectivity-ca\") pod \"konnectivity-agent-9465m\" (UID: \"c9af89df-b4ab-4045-b2fc-4762862c0d37\") " pod="kube-system/konnectivity-agent-9465m" Apr 16 17:39:52.355274 ip-10-0-138-45 
kubenswrapper[2574]: I0416 17:39:52.354824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-sys-fs\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-kubernetes\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-cni-bin\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-etc-selinux\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgdj\" (UniqueName: \"kubernetes.io/projected/a3277964-04b5-4058-a09e-f604a2b93fe1-kube-api-access-rkgdj\") pod 
\"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354949 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-k8s-cni-cncf-io\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.354979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-multus-certs\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.355274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355007 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-slash\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-node-log\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-conf-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ee69609-b777-4f06-988b-8e74dd389f13-ovn-node-metrics-cert\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355082 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-modprobe-d\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-cni-multus\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-cni-netd\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355177 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/810acbe7-5369-4316-8f52-ddfcdd34e838-host\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-host\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-ovnkube-script-lib\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355286 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08a0b4d9-4a6f-45bc-821e-59612b76469a-host-slash\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.356050 ip-10-0-138-45 
kubenswrapper[2574]: I0416 17:39:52.355301 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08a0b4d9-4a6f-45bc-821e-59612b76469a-iptables-alerter-script\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-socket-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355358 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-run-netns\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-socket-dir-parent\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.356050 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355403 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1f0a51e-003d-4753-8b66-791533e66dea-multus-daemon-config\") 
pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.356717 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355425 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rj2\" (UniqueName: \"kubernetes.io/projected/c1f0a51e-003d-4753-8b66-791533e66dea-kube-api-access-j2rj2\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.356717 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-env-overrides\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.356717 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-kubelet\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.356717 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.355527 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-ovnkube-config\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.369373 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.369354 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:39:52.380828 
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.380798 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:34:51 +0000 UTC" deadline="2027-11-26 01:37:51.046855224 +0000 UTC" Apr 16 17:39:52.380828 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.380827 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14119h57m58.666031971s" Apr 16 17:39:52.441619 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.441586 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 17:39:52.454122 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.454073 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal" event={"ID":"2eb2699ca22310dd4864f65d554362ea","Type":"ContainerStarted","Data":"0f75f0e4af8febe19f9fc5f4e29aae8e0cc610f627e9eb51e15ad24a6c2a89e0"} Apr 16 17:39:52.455141 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.455114 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal" event={"ID":"cc1dec6d25ddbc91146c8c11e71acdd6","Type":"ContainerStarted","Data":"5d93906cfe9865b6ddd1bcb05763747b9d1f4341d69b72462908aa4d005e4194"} Apr 16 17:39:52.456327 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-k8s-cni-cncf-io\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.456443 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-multus-certs\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.456443 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-slash\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.456443 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-node-log\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.456443 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-conf-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.456443 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456434 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ee69609-b777-4f06-988b-8e74dd389f13-ovn-node-metrics-cert\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.456443 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456434 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-k8s-cni-cncf-io\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-slash\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-modprobe-d\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-conf-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-node-log\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-multus-certs\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-modprobe-d\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-cni-multus\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-cni-netd\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/810acbe7-5369-4316-8f52-ddfcdd34e838-host\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-host\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.456762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/810acbe7-5369-4316-8f52-ddfcdd34e838-host\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456764 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-ovnkube-script-lib\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/08a0b4d9-4a6f-45bc-821e-59612b76469a-host-slash\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08a0b4d9-4a6f-45bc-821e-59612b76469a-iptables-alerter-script\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-cni-netd\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456849 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-socket-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456902 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08a0b4d9-4a6f-45bc-821e-59612b76469a-host-slash\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-run-netns\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-socket-dir-parent\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/c1f0a51e-003d-4753-8b66-791533e66dea-multus-daemon-config\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rj2\" (UniqueName: \"kubernetes.io/projected/c1f0a51e-003d-4753-8b66-791533e66dea-kube-api-access-j2rj2\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-env-overrides\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-system-cni-dir\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457086 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-kubelet\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-ovnkube-config\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.457341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1f0a51e-003d-4753-8b66-791533e66dea-cni-binary-copy\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/810acbe7-5369-4316-8f52-ddfcdd34e838-serviceca\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.458143 ip-10-0-138-45 
kubenswrapper[2574]: I0416 17:39:52.457225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ck7g\" (UniqueName: \"kubernetes.io/projected/810acbe7-5369-4316-8f52-ddfcdd34e838-kube-api-access-2ck7g\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-registration-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-device-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysconfig\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457320 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-ovnkube-script-lib\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457326 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-tuned\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-hostroot\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-host\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/08a0b4d9-4a6f-45bc-821e-59612b76469a-iptables-alerter-script\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-hostroot\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.456713 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-cni-multus\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457465 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-socket-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-run-netns\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-env-overrides\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.458143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/810acbe7-5369-4316-8f52-ddfcdd34e838-serviceca\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-socket-dir-parent\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.457978 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-kubelet\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458020 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1f0a51e-003d-4753-8b66-791533e66dea-cni-binary-copy\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458041 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1f0a51e-003d-4753-8b66-791533e66dea-multus-daemon-config\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458054 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mmnm\" (UniqueName: \"kubernetes.io/projected/08a0b4d9-4a6f-45bc-821e-59612b76469a-kube-api-access-4mmnm\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458081 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysctl-d\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3277964-04b5-4058-a09e-f604a2b93fe1-tmp\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458115 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-device-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-systemd\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-os-release\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458166 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-registration-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-cni-bin\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-var-lib-cni-bin\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysconfig\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2ee69609-b777-4f06-988b-8e74dd389f13-ovnkube-config\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458425 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-systemd\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458473 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-os-release\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458541 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysctl-d\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qx9d\" (UniqueName: \"kubernetes.io/projected/2ee69609-b777-4f06-988b-8e74dd389f13-kube-api-access-2qx9d\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl7j\" (UniqueName: \"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-sys\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cnibin\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-system-cni-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-kubelet\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-os-release\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458809 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db2l4\" (UniqueName: \"kubernetes.io/projected/88ac585a-175b-45f2-8696-1c754b2a5a05-kube-api-access-db2l4\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-cnibin\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-var-lib-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-run-ovn-kubernetes\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458970 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-kubelet\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.458999 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-system-cni-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459068 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-var-lib-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.459820 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-lib-modules\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459069 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-cnibin\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-var-lib-kubelet\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-run-ovn-kubernetes\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-sys\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-etc-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-var-lib-kubelet\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-etc-openvswitch\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-netns\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-etc-kubernetes\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-host-run-netns\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-systemd-units\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-lib-modules\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459279 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c9af89df-b4ab-4045-b2fc-4762862c0d37-agent-certs\") pod \"konnectivity-agent-9465m\" (UID: \"c9af89df-b4ab-4045-b2fc-4762862c0d37\") " pod="kube-system/konnectivity-agent-9465m"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-etc-kubernetes\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459301 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-systemd-units\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-systemd\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459347 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-systemd\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.460674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvrjx\" (UniqueName: \"kubernetes.io/projected/fb3c3b95-67e6-454b-854a-5ec64e63536a-kube-api-access-cvrjx\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-cni-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-ovn\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-log-socket\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysctl-conf\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-run\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cni-binary-copy\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459498 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1f0a51e-003d-4753-8b66-791533e66dea-multus-cni-dir\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459602 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c9af89df-b4ab-4045-b2fc-4762862c0d37-konnectivity-ca\") pod \"konnectivity-agent-9465m\" (UID: \"c9af89df-b4ab-4045-b2fc-4762862c0d37\") " pod="kube-system/konnectivity-agent-9465m"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-sys-fs\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459649 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-kubernetes\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459676 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7x5g\" (UniqueName: \"kubernetes.io/projected/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-kube-api-access-c7x5g\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459702 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459728 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-cni-bin\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-run-ovn\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-etc-selinux\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.461481 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459771 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgdj\" (UniqueName: \"kubernetes.io/projected/a3277964-04b5-4058-a09e-f604a2b93fe1-kube-api-access-rkgdj\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-log-socket\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459700 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-sysctl-conf\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-sys-fs\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-kubernetes\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3277964-04b5-4058-a09e-f604a2b93fe1-run\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459891 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-cni-bin\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.459930 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2ee69609-b777-4f06-988b-8e74dd389f13-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.460130 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fb3c3b95-67e6-454b-854a-5ec64e63536a-etc-selinux\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.460390 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c9af89df-b4ab-4045-b2fc-4762862c0d37-konnectivity-ca\") pod \"konnectivity-agent-9465m\" (UID: \"c9af89df-b4ab-4045-b2fc-4762862c0d37\") " pod="kube-system/konnectivity-agent-9465m"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.461167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3277964-04b5-4058-a09e-f604a2b93fe1-tmp\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.461201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3277964-04b5-4058-a09e-f604a2b93fe1-etc-tuned\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " pod="openshift-cluster-node-tuning-operator/tuned-thhnl"
Apr 16 17:39:52.462297 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.461329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2ee69609-b777-4f06-988b-8e74dd389f13-ovn-node-metrics-cert\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:39:52.462915 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.462894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c9af89df-b4ab-4045-b2fc-4762862c0d37-agent-certs\") pod \"konnectivity-agent-9465m\" (UID: \"c9af89df-b4ab-4045-b2fc-4762862c0d37\") " pod="kube-system/konnectivity-agent-9465m"
Apr 16 17:39:52.465722 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:52.465698 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:39:52.465722 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:52.465723 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:39:52.465931 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:52.465736 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rvl7j for pod openshift-network-diagnostics/network-check-target-2kgzs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:39:52.465931 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:52.465811 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j podName:b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d nodeName:}" failed. No retries permitted until 2026-04-16 17:39:52.965789702 +0000 UTC m=+3.125159978 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-rvl7j" (UniqueName: "kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j") pod "network-check-target-2kgzs" (UID: "b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:39:52.466466 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.466438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rj2\" (UniqueName: \"kubernetes.io/projected/c1f0a51e-003d-4753-8b66-791533e66dea-kube-api-access-j2rj2\") pod \"multus-7zg22\" (UID: \"c1f0a51e-003d-4753-8b66-791533e66dea\") " pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.468301 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.468282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qx9d\" (UniqueName: \"kubernetes.io/projected/2ee69609-b777-4f06-988b-8e74dd389f13-kube-api-access-2qx9d\") pod \"ovnkube-node-42vwn\" (UID: \"2ee69609-b777-4f06-988b-8e74dd389f13\") " pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.469275 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.469252 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ck7g\" (UniqueName: \"kubernetes.io/projected/810acbe7-5369-4316-8f52-ddfcdd34e838-kube-api-access-2ck7g\") pod \"node-ca-gbhm8\" (UID: \"810acbe7-5369-4316-8f52-ddfcdd34e838\") " pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.470647 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.470604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgdj\" (UniqueName: \"kubernetes.io/projected/a3277964-04b5-4058-a09e-f604a2b93fe1-kube-api-access-rkgdj\") pod \"tuned-thhnl\" (UID: \"a3277964-04b5-4058-a09e-f604a2b93fe1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.470787 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.470769 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mmnm\" (UniqueName: \"kubernetes.io/projected/08a0b4d9-4a6f-45bc-821e-59612b76469a-kube-api-access-4mmnm\") pod \"iptables-alerter-cwp8j\" (UID: \"08a0b4d9-4a6f-45bc-821e-59612b76469a\") " pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.471215 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.471195 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvrjx\" (UniqueName: \"kubernetes.io/projected/fb3c3b95-67e6-454b-854a-5ec64e63536a-kube-api-access-cvrjx\") pod \"aws-ebs-csi-driver-node-cv6hz\" (UID: \"fb3c3b95-67e6-454b-854a-5ec64e63536a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.560415 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cnibin\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560415 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-os-release\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560693 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db2l4\" (UniqueName: 
\"kubernetes.io/projected/88ac585a-175b-45f2-8696-1c754b2a5a05-kube-api-access-db2l4\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:39:52.560693 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560494 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cnibin\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560693 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560534 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-os-release\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560693 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cni-binary-copy\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560693 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7x5g\" (UniqueName: \"kubernetes.io/projected/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-kube-api-access-c7x5g\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560693 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560680 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:39:52.560986 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560986 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-system-cni-dir\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560986 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560986 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: 
\"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560986 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:52.560857 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:39:52.560986 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.560889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-system-cni-dir\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.560986 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:52.560927 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs podName:88ac585a-175b-45f2-8696-1c754b2a5a05 nodeName:}" failed. No retries permitted until 2026-04-16 17:39:53.060907318 +0000 UTC m=+3.220277598 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs") pod "network-metrics-daemon-cxxrm" (UID: "88ac585a-175b-45f2-8696-1c754b2a5a05") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:39:52.561344 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.561135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cni-binary-copy\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.561399 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.561351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.561483 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.561404 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.561616 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.561598 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " 
pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.576207 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.576131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db2l4\" (UniqueName: \"kubernetes.io/projected/88ac585a-175b-45f2-8696-1c754b2a5a05-kube-api-access-db2l4\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:39:52.576884 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.576865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7x5g\" (UniqueName: \"kubernetes.io/projected/37af66f0-77b6-4e5a-9cd0-c7267f06ab11-kube-api-access-c7x5g\") pod \"multus-additional-cni-plugins-5ww9l\" (UID: \"37af66f0-77b6-4e5a-9cd0-c7267f06ab11\") " pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:52.640803 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.640769 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9465m" Apr 16 17:39:52.647728 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.647696 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7zg22" Apr 16 17:39:52.658365 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.658337 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cwp8j" Apr 16 17:39:52.662437 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.662126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:39:52.668342 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.668318 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" Apr 16 17:39:52.674752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.674733 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-thhnl" Apr 16 17:39:52.682238 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.682214 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gbhm8" Apr 16 17:39:52.686811 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:52.686792 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" Apr 16 17:39:53.064508 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.064425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl7j\" (UniqueName: \"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:39:53.064508 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.064493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:39:53.064751 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:53.064612 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:39:53.064751 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:53.064631 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:39:53.064751 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:53.064643 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rvl7j for pod openshift-network-diagnostics/network-check-target-2kgzs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:39:53.064751 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:53.064644 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:39:53.064751 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:53.064718 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j podName:b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d nodeName:}" failed. No retries permitted until 2026-04-16 17:39:54.064702453 +0000 UTC m=+4.224072716 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvl7j" (UniqueName: "kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j") pod "network-check-target-2kgzs" (UID: "b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:39:53.064751 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:53.064735 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs podName:88ac585a-175b-45f2-8696-1c754b2a5a05 nodeName:}" failed. No retries permitted until 2026-04-16 17:39:54.064728715 +0000 UTC m=+4.224098975 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs") pod "network-metrics-daemon-cxxrm" (UID: "88ac585a-175b-45f2-8696-1c754b2a5a05") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:39:53.186894 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:53.186860 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3277964_04b5_4058_a09e_f604a2b93fe1.slice/crio-c29261a2a76a1b7a1c1eb92b6a2ff629fe1c1e398012aa5f17b83ff1cf5a818e WatchSource:0}: Error finding container c29261a2a76a1b7a1c1eb92b6a2ff629fe1c1e398012aa5f17b83ff1cf5a818e: Status 404 returned error can't find the container with id c29261a2a76a1b7a1c1eb92b6a2ff629fe1c1e398012aa5f17b83ff1cf5a818e Apr 16 17:39:53.188339 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:53.188242 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08a0b4d9_4a6f_45bc_821e_59612b76469a.slice/crio-ee0fc091aa49922ea0f6985a420593fd783419e9c98e1e29657df324ccac8cf3 WatchSource:0}: Error finding container ee0fc091aa49922ea0f6985a420593fd783419e9c98e1e29657df324ccac8cf3: Status 404 returned error can't find the container with id ee0fc091aa49922ea0f6985a420593fd783419e9c98e1e29657df324ccac8cf3 Apr 16 17:39:53.191134 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:53.191108 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee69609_b777_4f06_988b_8e74dd389f13.slice/crio-2c9b4aac1773878e5506bf073e62d4e6b91b49b7f54dcdb4e459b845f817a6de WatchSource:0}: Error finding container 2c9b4aac1773878e5506bf073e62d4e6b91b49b7f54dcdb4e459b845f817a6de: Status 404 returned error can't find the container with id 2c9b4aac1773878e5506bf073e62d4e6b91b49b7f54dcdb4e459b845f817a6de Apr 16 17:39:53.211028 
ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:53.211000 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37af66f0_77b6_4e5a_9cd0_c7267f06ab11.slice/crio-3a9922ea8b28998e2160802d83c9a92e181803bb054e6110fa0430913a5ea568 WatchSource:0}: Error finding container 3a9922ea8b28998e2160802d83c9a92e181803bb054e6110fa0430913a5ea568: Status 404 returned error can't find the container with id 3a9922ea8b28998e2160802d83c9a92e181803bb054e6110fa0430913a5ea568 Apr 16 17:39:53.211792 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:53.211770 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb3c3b95_67e6_454b_854a_5ec64e63536a.slice/crio-ada4ae33194b7841088147e749790700d4f0f4cb0a5ee574d1cc11876d430922 WatchSource:0}: Error finding container ada4ae33194b7841088147e749790700d4f0f4cb0a5ee574d1cc11876d430922: Status 404 returned error can't find the container with id ada4ae33194b7841088147e749790700d4f0f4cb0a5ee574d1cc11876d430922 Apr 16 17:39:53.212561 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:53.212473 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9af89df_b4ab_4045_b2fc_4762862c0d37.slice/crio-f2cf6347e77fec7393e27b56b75e8d3e1ef64dec35dd58d81eadca42a3ed302a WatchSource:0}: Error finding container f2cf6347e77fec7393e27b56b75e8d3e1ef64dec35dd58d81eadca42a3ed302a: Status 404 returned error can't find the container with id f2cf6347e77fec7393e27b56b75e8d3e1ef64dec35dd58d81eadca42a3ed302a Apr 16 17:39:53.213169 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:53.213148 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810acbe7_5369_4316_8f52_ddfcdd34e838.slice/crio-6c82dada0050fb8f91f566ac0409c4e27d110cdf39a18fe7f3c0f25bfc97998f WatchSource:0}: Error 
finding container 6c82dada0050fb8f91f566ac0409c4e27d110cdf39a18fe7f3c0f25bfc97998f: Status 404 returned error can't find the container with id 6c82dada0050fb8f91f566ac0409c4e27d110cdf39a18fe7f3c0f25bfc97998f Apr 16 17:39:53.239584 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:39:53.239535 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f0a51e_003d_4753_8b66_791533e66dea.slice/crio-29bdba13dc29f455aae2e9535dfd60bb7c44a156c846e9537f4e0e6ba1bf35d1 WatchSource:0}: Error finding container 29bdba13dc29f455aae2e9535dfd60bb7c44a156c846e9537f4e0e6ba1bf35d1: Status 404 returned error can't find the container with id 29bdba13dc29f455aae2e9535dfd60bb7c44a156c846e9537f4e0e6ba1bf35d1 Apr 16 17:39:53.381457 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.381197 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:34:51 +0000 UTC" deadline="2027-10-21 16:19:08.916974175 +0000 UTC" Apr 16 17:39:53.381457 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.381385 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13270h39m15.535594339s" Apr 16 17:39:53.450692 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.450659 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:39:53.450857 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:53.450769 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:39:53.458222 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.458189 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal" event={"ID":"2eb2699ca22310dd4864f65d554362ea","Type":"ContainerStarted","Data":"5774de39d2184cd537ca924e5300fd3667710ddb9afb8e93b634671a5c980510"} Apr 16 17:39:53.459361 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.459332 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7zg22" event={"ID":"c1f0a51e-003d-4753-8b66-791533e66dea","Type":"ContainerStarted","Data":"29bdba13dc29f455aae2e9535dfd60bb7c44a156c846e9537f4e0e6ba1bf35d1"} Apr 16 17:39:53.460280 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.460256 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" event={"ID":"fb3c3b95-67e6-454b-854a-5ec64e63536a","Type":"ContainerStarted","Data":"ada4ae33194b7841088147e749790700d4f0f4cb0a5ee574d1cc11876d430922"} Apr 16 17:39:53.461171 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.461150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gbhm8" event={"ID":"810acbe7-5369-4316-8f52-ddfcdd34e838","Type":"ContainerStarted","Data":"6c82dada0050fb8f91f566ac0409c4e27d110cdf39a18fe7f3c0f25bfc97998f"} Apr 16 17:39:53.462090 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.462066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"2c9b4aac1773878e5506bf073e62d4e6b91b49b7f54dcdb4e459b845f817a6de"} Apr 16 17:39:53.463461 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.463436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-thhnl" event={"ID":"a3277964-04b5-4058-a09e-f604a2b93fe1","Type":"ContainerStarted","Data":"c29261a2a76a1b7a1c1eb92b6a2ff629fe1c1e398012aa5f17b83ff1cf5a818e"} Apr 16 17:39:53.464325 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.464308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9465m" event={"ID":"c9af89df-b4ab-4045-b2fc-4762862c0d37","Type":"ContainerStarted","Data":"f2cf6347e77fec7393e27b56b75e8d3e1ef64dec35dd58d81eadca42a3ed302a"} Apr 16 17:39:53.465262 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.465230 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerStarted","Data":"3a9922ea8b28998e2160802d83c9a92e181803bb054e6110fa0430913a5ea568"} Apr 16 17:39:53.466213 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.466191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cwp8j" event={"ID":"08a0b4d9-4a6f-45bc-821e-59612b76469a","Type":"ContainerStarted","Data":"ee0fc091aa49922ea0f6985a420593fd783419e9c98e1e29657df324ccac8cf3"} Apr 16 17:39:53.471762 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.471228 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-45.ec2.internal" podStartSLOduration=2.471218191 podStartE2EDuration="2.471218191s" podCreationTimestamp="2026-04-16 17:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:39:53.471218277 +0000 UTC m=+3.630588561" watchObservedRunningTime="2026-04-16 17:39:53.471218191 +0000 UTC m=+3.630588473" Apr 16 17:39:53.541952 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:53.541907 2574 reflector.go:430] "Caches populated" type="*v1.Service" 
reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:39:54.071517 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:54.071479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl7j\" (UniqueName: \"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:39:54.071728 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:54.071555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:39:54.071728 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:54.071711 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:39:54.071835 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:54.071774 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs podName:88ac585a-175b-45f2-8696-1c754b2a5a05 nodeName:}" failed. No retries permitted until 2026-04-16 17:39:56.071755007 +0000 UTC m=+6.231125285 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs") pod "network-metrics-daemon-cxxrm" (UID: "88ac585a-175b-45f2-8696-1c754b2a5a05") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:39:54.072168 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:54.072044 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:39:54.072168 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:54.072071 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:39:54.072168 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:54.072083 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rvl7j for pod openshift-network-diagnostics/network-check-target-2kgzs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:39:54.072168 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:54.072133 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j podName:b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d nodeName:}" failed. No retries permitted until 2026-04-16 17:39:56.072116971 +0000 UTC m=+6.231487244 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvl7j" (UniqueName: "kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j") pod "network-check-target-2kgzs" (UID: "b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:39:54.451187 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:54.450674 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:39:54.451187 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:54.450822 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:39:55.450357 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:55.450284 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:39:55.450524 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:55.450418 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d"
Apr 16 17:39:55.492767 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:55.492730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal" event={"ID":"cc1dec6d25ddbc91146c8c11e71acdd6","Type":"ContainerDied","Data":"326280d16815298934e87bd944f946337bcf7d6de726095afa1209a3aa6bb7ed"}
Apr 16 17:39:55.493226 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:55.492550 2574 generic.go:358] "Generic (PLEG): container finished" podID="cc1dec6d25ddbc91146c8c11e71acdd6" containerID="326280d16815298934e87bd944f946337bcf7d6de726095afa1209a3aa6bb7ed" exitCode=0
Apr 16 17:39:56.088566 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:56.088525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl7j\" (UniqueName: \"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:39:56.088784 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:56.088608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:39:56.088784 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:56.088753 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:39:56.088933 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:56.088817 2574 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs podName:88ac585a-175b-45f2-8696-1c754b2a5a05 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:00.088798126 +0000 UTC m=+10.248168401 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs") pod "network-metrics-daemon-cxxrm" (UID: "88ac585a-175b-45f2-8696-1c754b2a5a05") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:39:56.089252 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:56.089222 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:39:56.089252 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:56.089249 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:39:56.089427 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:56.089263 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rvl7j for pod openshift-network-diagnostics/network-check-target-2kgzs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:39:56.089427 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:56.089309 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j podName:b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:00.089293972 +0000 UTC m=+10.248664234 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvl7j" (UniqueName: "kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j") pod "network-check-target-2kgzs" (UID: "b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:39:56.451811 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:56.451729 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:39:56.451968 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:56.451868 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:39:57.450740 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:57.450193 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:39:57.450740 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:57.450322 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d"
Apr 16 17:39:58.450266 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:58.450234 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:39:58.450455 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:58.450375 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:39:59.451290 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:39:59.450773 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:39:59.451290 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:39:59.450903 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d"
Apr 16 17:40:00.126619 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:00.126563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl7j\" (UniqueName: \"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:00.126839 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:00.126650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:00.126839 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:00.126752 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:00.126839 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:00.126765 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:00.126839 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:00.126776 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:00.126839 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:00.126789 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rvl7j for pod openshift-network-diagnostics/network-check-target-2kgzs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:00.126839 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:00.126826 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs podName:88ac585a-175b-45f2-8696-1c754b2a5a05 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:08.126806069 +0000 UTC m=+18.286176340 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs") pod "network-metrics-daemon-cxxrm" (UID: "88ac585a-175b-45f2-8696-1c754b2a5a05") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:00.127120 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:00.126846 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j podName:b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:08.126834843 +0000 UTC m=+18.286205110 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvl7j" (UniqueName: "kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j") pod "network-check-target-2kgzs" (UID: "b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:00.451167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:00.451131 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:00.451335 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:00.451278 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:40:01.450390 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:01.450349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:01.450563 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:01.450496 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d"
Apr 16 17:40:02.450474 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.450433 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:02.450891 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:02.450582 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:40:02.799893 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.799858 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-49fnc"]
Apr 16 17:40:02.802705 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.802671 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.802834 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:02.802757 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516"
Apr 16 17:40:02.849652 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.849603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cf7246f1-b28a-4d7e-bda9-3608f1c76516-kubelet-config\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.849814 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.849673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.849814 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.849704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cf7246f1-b28a-4d7e-bda9-3608f1c76516-dbus\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.951023 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.950989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cf7246f1-b28a-4d7e-bda9-3608f1c76516-kubelet-config\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.951191 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.951037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.951191 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.951057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cf7246f1-b28a-4d7e-bda9-3608f1c76516-dbus\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.951191 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.951133 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cf7246f1-b28a-4d7e-bda9-3608f1c76516-kubelet-config\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.951191 ip-10-0-138-45
kubenswrapper[2574]: E0416 17:40:02.951154 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:02.951377 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:02.951206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cf7246f1-b28a-4d7e-bda9-3608f1c76516-dbus\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:02.951377 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:02.951217 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret podName:cf7246f1-b28a-4d7e-bda9-3608f1c76516 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:03.451204068 +0000 UTC m=+13.610574333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret") pod "global-pull-secret-syncer-49fnc" (UID: "cf7246f1-b28a-4d7e-bda9-3608f1c76516") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:03.450748 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:03.450720 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:03.451168 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:03.450826 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d"
Apr 16 17:40:03.455924 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:03.455900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:03.456047 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:03.456024 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:03.456102 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:03.456096 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret podName:cf7246f1-b28a-4d7e-bda9-3608f1c76516 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:04.456076303 +0000 UTC m=+14.615446594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret") pod "global-pull-secret-syncer-49fnc" (UID: "cf7246f1-b28a-4d7e-bda9-3608f1c76516") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:04.450566 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:04.450526 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:04.450566 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:04.450550 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:04.450827 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:04.450695 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:40:04.450827 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:04.450797 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516"
Apr 16 17:40:04.462672 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:04.462644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:04.462849 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:04.462794 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:04.462905 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:04.462865 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret podName:cf7246f1-b28a-4d7e-bda9-3608f1c76516 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:06.462846594 +0000 UTC m=+16.622216869 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret") pod "global-pull-secret-syncer-49fnc" (UID: "cf7246f1-b28a-4d7e-bda9-3608f1c76516") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:05.450095 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:05.450058 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:05.450289 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:05.450184 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d"
Apr 16 17:40:06.450680 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:06.450647 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:06.451095 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:06.450647 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:06.451095 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:06.450787 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:40:06.451095 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:06.450879 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516"
Apr 16 17:40:06.475038 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:06.475003 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:06.475224 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:06.475169 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:06.475292 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:06.475252 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret podName:cf7246f1-b28a-4d7e-bda9-3608f1c76516 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:10.475231406 +0000 UTC m=+20.634601667 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret") pod "global-pull-secret-syncer-49fnc" (UID: "cf7246f1-b28a-4d7e-bda9-3608f1c76516") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:07.449942 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:07.449909 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:07.450127 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:07.450042 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d"
Apr 16 17:40:08.187024 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:08.186946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl7j\" (UniqueName: \"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:08.187430 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:08.187057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:08.187430 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:08.187144 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:08.187430 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:08.187161 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:08.187430 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:08.187171 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:08.187430 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:08.187184 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rvl7j for pod openshift-network-diagnostics/network-check-target-2kgzs: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:08.187430 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:08.187220 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs podName:88ac585a-175b-45f2-8696-1c754b2a5a05 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:24.187206808 +0000 UTC m=+34.346577068 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs") pod "network-metrics-daemon-cxxrm" (UID: "88ac585a-175b-45f2-8696-1c754b2a5a05") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:08.187430 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:08.187236 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j podName:b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:24.18722839 +0000 UTC m=+34.346598650 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rvl7j" (UniqueName: "kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j") pod "network-check-target-2kgzs" (UID: "b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:08.450443 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:08.450349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:08.450622 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:08.450350 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:08.450622 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:08.450492 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:40:08.450622 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:08.450561 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516"
Apr 16 17:40:09.450604 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:09.450555 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:09.451041 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:09.450693 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d"
Apr 16 17:40:10.450558 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:10.450535 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:10.450674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:10.450659 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:10.450943 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:10.450666 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05"
Apr 16 17:40:10.450943 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:10.450735 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516"
Apr 16 17:40:10.502846 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:10.502820 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:10.502967 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:10.502952 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:10.503020 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:10.503009 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret podName:cf7246f1-b28a-4d7e-bda9-3608f1c76516 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:18.502994937 +0000 UTC m=+28.662365197 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret") pod "global-pull-secret-syncer-49fnc" (UID: "cf7246f1-b28a-4d7e-bda9-3608f1c76516") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:11.450629 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.450238 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:11.450801 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:11.450718 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:40:11.523800 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.523761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal" event={"ID":"cc1dec6d25ddbc91146c8c11e71acdd6","Type":"ContainerStarted","Data":"205720fa972a14d053edd7b95b1e66ec56dd08d4152fd2d21cdaa2118df212a9"} Apr 16 17:40:11.524971 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.524943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7zg22" event={"ID":"c1f0a51e-003d-4753-8b66-791533e66dea","Type":"ContainerStarted","Data":"88d29c51bbd3be69d1b50500fcfd9e7b2fd586257980caeb176ef6be89c16f26"} Apr 16 17:40:11.526094 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.526074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" 
event={"ID":"fb3c3b95-67e6-454b-854a-5ec64e63536a","Type":"ContainerStarted","Data":"e7ab86731ecde5713aa138c9482c3441c944b11e3d58208d1fdfacb25a614bd9"} Apr 16 17:40:11.527251 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.527228 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gbhm8" event={"ID":"810acbe7-5369-4316-8f52-ddfcdd34e838","Type":"ContainerStarted","Data":"e306fb1b060cdaafbe93830f0322f4833620b3066dd251a82ce5e5d1ad52f4b5"} Apr 16 17:40:11.529437 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.529419 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log" Apr 16 17:40:11.529718 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.529701 2574 generic.go:358] "Generic (PLEG): container finished" podID="2ee69609-b777-4f06-988b-8e74dd389f13" containerID="9a9edaf02a3102b0d28cf5d66234a4f50e5e692b5d562d2f631af60b50a0ca4f" exitCode=1 Apr 16 17:40:11.529790 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.529751 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"6602557f117e2b941330fd0a96de7cba66c7d1da57d0b1e86f904772fed87163"} Apr 16 17:40:11.529790 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.529765 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"7833b91775de467e3dda1c9959f1f595ae9010b5581455d5500edb52e1c5cde1"} Apr 16 17:40:11.529790 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.529777 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" 
event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"a81be49612b0583188a28c33953cccf718e9f183144945febdd948d76060719c"} Apr 16 17:40:11.529790 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.529785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"b42299b2c46143f21627708f65202621cd6f21b8805d00e0ef2dafe9e8377917"} Apr 16 17:40:11.529912 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.529796 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerDied","Data":"9a9edaf02a3102b0d28cf5d66234a4f50e5e692b5d562d2f631af60b50a0ca4f"} Apr 16 17:40:11.529912 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.529810 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"e9f0763691740f918f32e417ae553291ce14ea1bc7e548a1ff9eaa5e7fce8a13"} Apr 16 17:40:11.530918 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.530887 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-thhnl" event={"ID":"a3277964-04b5-4058-a09e-f604a2b93fe1","Type":"ContainerStarted","Data":"04cf958f7efd470145123a4bddf1ef87e9143953e738986bf78ecb47d690b00a"} Apr 16 17:40:11.532083 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.532064 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9465m" event={"ID":"c9af89df-b4ab-4045-b2fc-4762862c0d37","Type":"ContainerStarted","Data":"3cb0e834cf327512974073709de01526a1a898e258efa0a645c2b4a898bb474e"} Apr 16 17:40:11.533203 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.533184 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="37af66f0-77b6-4e5a-9cd0-c7267f06ab11" containerID="4d8ecfc64c701c55af1ec61ed36b45496f4e5d31946207d436c5c3a70d03af5d" exitCode=0 Apr 16 17:40:11.533282 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.533220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerDied","Data":"4d8ecfc64c701c55af1ec61ed36b45496f4e5d31946207d436c5c3a70d03af5d"} Apr 16 17:40:11.537195 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.537165 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-45.ec2.internal" podStartSLOduration=20.537154688 podStartE2EDuration="20.537154688s" podCreationTimestamp="2026-04-16 17:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:11.536705574 +0000 UTC m=+21.696075857" watchObservedRunningTime="2026-04-16 17:40:11.537154688 +0000 UTC m=+21.696524967" Apr 16 17:40:11.570912 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.570861 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-thhnl" podStartSLOduration=4.291822198 podStartE2EDuration="21.570846631s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:39:53.189006746 +0000 UTC m=+3.348377012" lastFinishedPulling="2026-04-16 17:40:10.468031166 +0000 UTC m=+20.627401445" observedRunningTime="2026-04-16 17:40:11.570370711 +0000 UTC m=+21.729740993" watchObservedRunningTime="2026-04-16 17:40:11.570846631 +0000 UTC m=+21.730216913" Apr 16 17:40:11.583484 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.583434 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9465m" podStartSLOduration=4.340717767 
podStartE2EDuration="21.583420352s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:39:53.214157355 +0000 UTC m=+3.373527618" lastFinishedPulling="2026-04-16 17:40:10.45685993 +0000 UTC m=+20.616230203" observedRunningTime="2026-04-16 17:40:11.583130057 +0000 UTC m=+21.742500338" watchObservedRunningTime="2026-04-16 17:40:11.583420352 +0000 UTC m=+21.742790633" Apr 16 17:40:11.622413 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.622363 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gbhm8" podStartSLOduration=9.185588427 podStartE2EDuration="21.622349168s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:39:53.214890974 +0000 UTC m=+3.374261233" lastFinishedPulling="2026-04-16 17:40:05.651651707 +0000 UTC m=+15.811021974" observedRunningTime="2026-04-16 17:40:11.622332021 +0000 UTC m=+21.781702314" watchObservedRunningTime="2026-04-16 17:40:11.622349168 +0000 UTC m=+21.781719447" Apr 16 17:40:11.622648 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.622628 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7zg22" podStartSLOduration=4.38971455 podStartE2EDuration="21.622622944s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:39:53.242951404 +0000 UTC m=+3.402321668" lastFinishedPulling="2026-04-16 17:40:10.47585979 +0000 UTC m=+20.635230062" observedRunningTime="2026-04-16 17:40:11.596487022 +0000 UTC m=+21.755857303" watchObservedRunningTime="2026-04-16 17:40:11.622622944 +0000 UTC m=+21.781993226" Apr 16 17:40:11.839619 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:11.839561 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 17:40:12.434443 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:12.434225 2574 
reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T17:40:11.839598027Z","UUID":"ba2f3c2a-1fbf-4e47-89bb-a0c38f8ee159","Handler":null,"Name":"","Endpoint":""} Apr 16 17:40:12.436138 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:12.436115 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 17:40:12.436261 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:12.436145 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 17:40:12.450766 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:12.450731 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:40:12.450912 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:12.450744 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:12.450912 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:12.450870 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05" Apr 16 17:40:12.451252 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:12.450928 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516" Apr 16 17:40:12.537255 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:12.537216 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" event={"ID":"fb3c3b95-67e6-454b-854a-5ec64e63536a","Type":"ContainerStarted","Data":"a05ecc8bcc6d428ed48ded50122a30b6a50d9db2aa818f046c6453f1ff7d6fe3"} Apr 16 17:40:12.540311 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:12.539616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cwp8j" event={"ID":"08a0b4d9-4a6f-45bc-821e-59612b76469a","Type":"ContainerStarted","Data":"6c3a1f35ec87488b34f348c42061ccccce817d17391fe0ba45b7dccdd0530698"} Apr 16 17:40:12.570795 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:12.570735 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cwp8j" podStartSLOduration=5.329503496 podStartE2EDuration="22.570718615s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:39:53.190193787 +0000 UTC m=+3.349564051" lastFinishedPulling="2026-04-16 17:40:10.431408899 +0000 UTC m=+20.590779170" observedRunningTime="2026-04-16 17:40:12.570627577 +0000 UTC m=+22.729997860" watchObservedRunningTime="2026-04-16 17:40:12.570718615 +0000 UTC m=+22.730088896" Apr 16 17:40:13.451069 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:13.450123 
2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:13.451069 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:13.450259 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:40:13.545119 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:13.545093 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log" Apr 16 17:40:13.545520 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:13.545493 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"3c7c1082f85d2878fc2228d2db688178922b32b75e90a7261c04a83909030683"} Apr 16 17:40:13.547553 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:13.547521 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" event={"ID":"fb3c3b95-67e6-454b-854a-5ec64e63536a","Type":"ContainerStarted","Data":"a9fb8191d05112b4270524cd1e19ca251fd3f0e9af79784e5a42226cfad0bb4f"} Apr 16 17:40:13.564684 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:13.564631 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cv6hz" podStartSLOduration=3.679545816 podStartE2EDuration="23.564615986s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:39:53.21336956 +0000 UTC m=+3.372739823" 
lastFinishedPulling="2026-04-16 17:40:13.09843973 +0000 UTC m=+23.257809993" observedRunningTime="2026-04-16 17:40:13.564524671 +0000 UTC m=+23.723894954" watchObservedRunningTime="2026-04-16 17:40:13.564615986 +0000 UTC m=+23.723986268" Apr 16 17:40:14.450589 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:14.450541 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:14.450778 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:14.450698 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516" Apr 16 17:40:14.450778 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:14.450762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:40:14.450901 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:14.450878 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05" Apr 16 17:40:15.450923 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:15.450890 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:15.451305 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:15.451020 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:40:15.510774 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:15.510735 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9465m" Apr 16 17:40:15.511633 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:15.511398 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9465m" Apr 16 17:40:15.551559 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:15.551535 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9465m" Apr 16 17:40:15.551961 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:15.551942 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9465m" Apr 16 17:40:16.450310 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.450281 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:40:16.450450 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.450284 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:16.450499 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:16.450379 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05" Apr 16 17:40:16.450499 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:16.450476 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516" Apr 16 17:40:16.554027 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.553998 2574 generic.go:358] "Generic (PLEG): container finished" podID="37af66f0-77b6-4e5a-9cd0-c7267f06ab11" containerID="5088f7f8bfb84737f2028f92a4e8605c1bae6f746f2d0b829d44ed091b9b2d67" exitCode=0 Apr 16 17:40:16.554805 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.554085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerDied","Data":"5088f7f8bfb84737f2028f92a4e8605c1bae6f746f2d0b829d44ed091b9b2d67"} Apr 16 17:40:16.557750 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.557734 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log" Apr 16 17:40:16.558075 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.558051 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"a2c5a723058eaf6639a1600e78f91f7abc9603d1b99e9e6d9895040eefddd4a4"} Apr 16 17:40:16.558485 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.558463 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:40:16.558695 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.558680 2574 scope.go:117] "RemoveContainer" containerID="9a9edaf02a3102b0d28cf5d66234a4f50e5e692b5d562d2f631af60b50a0ca4f" Apr 16 17:40:16.574152 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:16.574129 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:40:17.450708 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.450447 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:17.450839 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:17.450747 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:40:17.522830 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.522795 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2kgzs"] Apr 16 17:40:17.525374 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.525347 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-49fnc"] Apr 16 17:40:17.525516 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.525462 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:17.525591 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:17.525547 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516" Apr 16 17:40:17.528464 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.528442 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cxxrm"] Apr 16 17:40:17.528581 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.528541 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:40:17.528663 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:17.528643 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05" Apr 16 17:40:17.563674 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.563647 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log" Apr 16 17:40:17.564118 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.564043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" event={"ID":"2ee69609-b777-4f06-988b-8e74dd389f13","Type":"ContainerStarted","Data":"881cc79464dd07ac53e7a5bd711fbf3ff16b3ab54d8c259afcc86002decde9b6"} Apr 16 17:40:17.564409 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.564356 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:40:17.564409 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.564389 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:40:17.566368 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.566339 2574 generic.go:358] "Generic (PLEG): container finished" podID="37af66f0-77b6-4e5a-9cd0-c7267f06ab11" containerID="6c60ab32c6139833dd73c5649e0e3be2f1edf37c092e05ad12f0092a50563dfd" exitCode=0 Apr 16 17:40:17.566486 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.566367 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerDied","Data":"6c60ab32c6139833dd73c5649e0e3be2f1edf37c092e05ad12f0092a50563dfd"} Apr 16 17:40:17.566486 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.566427 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:17.566691 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:17.566667 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:40:17.580351 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.580331 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" Apr 16 17:40:17.591554 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:17.591514 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn" podStartSLOduration=10.278836484 podStartE2EDuration="27.591501597s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:39:53.209948098 +0000 UTC m=+3.369318358" lastFinishedPulling="2026-04-16 17:40:10.522613208 +0000 UTC m=+20.681983471" observedRunningTime="2026-04-16 17:40:17.59076456 +0000 UTC m=+27.750134841" watchObservedRunningTime="2026-04-16 17:40:17.591501597 +0000 UTC m=+27.750871881" Apr 16 17:40:18.563690 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:18.563655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:18.564072 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:18.563775 2574 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:18.564072 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:18.563829 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret podName:cf7246f1-b28a-4d7e-bda9-3608f1c76516 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:34.563815457 +0000 UTC m=+44.723185717 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret") pod "global-pull-secret-syncer-49fnc" (UID: "cf7246f1-b28a-4d7e-bda9-3608f1c76516") : object "kube-system"/"original-pull-secret" not registered Apr 16 17:40:18.570062 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:18.570038 2574 generic.go:358] "Generic (PLEG): container finished" podID="37af66f0-77b6-4e5a-9cd0-c7267f06ab11" containerID="c141b557f1c93a1e882e52d45bf9da7492f566d359c3d8ba72392a37c9c4b274" exitCode=0 Apr 16 17:40:18.570197 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:18.570123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerDied","Data":"c141b557f1c93a1e882e52d45bf9da7492f566d359c3d8ba72392a37c9c4b274"} Apr 16 17:40:19.450752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:19.450718 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:19.450959 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:19.450834 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:19.450959 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:19.450842 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516" Apr 16 17:40:19.450959 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:19.450930 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:40:19.451104 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:19.450983 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:40:19.451104 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:19.451051 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05" Apr 16 17:40:21.450657 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:21.450620 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:21.451172 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:21.450629 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:21.451172 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:21.450728 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:40:21.451172 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:21.450760 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:40:21.451172 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:21.450855 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516" Apr 16 17:40:21.451172 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:21.450966 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05" Apr 16 17:40:23.450139 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.450103 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:23.450536 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:23.450240 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2kgzs" podUID="b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d" Apr 16 17:40:23.450536 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.450103 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:23.450536 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:23.450315 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-49fnc" podUID="cf7246f1-b28a-4d7e-bda9-3608f1c76516" Apr 16 17:40:23.450536 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.450103 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:40:23.450536 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:23.450369 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cxxrm" podUID="88ac585a-175b-45f2-8696-1c754b2a5a05" Apr 16 17:40:23.663532 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.663495 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-45.ec2.internal" event="NodeReady" Apr 16 17:40:23.663708 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.663660 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 17:40:23.701226 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.701151 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7f8454dc74-5t5l5"] Apr 16 17:40:23.746026 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.745991 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht"] Apr 16 17:40:23.763519 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.763484 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"] Apr 16 17:40:23.763697 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.763551 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:40:23.763697 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.763663 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht" Apr 16 17:40:23.765868 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.765840 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 17:40:23.766000 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.765920 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 17:40:23.766058 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.766005 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 17:40:23.766160 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.766142 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n5znr\"" Apr 16 17:40:23.766259 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.766239 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.766348 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.766276 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.766423 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.766411 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-jskpj\"" Apr 16 17:40:23.780522 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.778726 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zcvg5"] Apr 16 17:40:23.780522 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.778759 2574 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 17:40:23.780522 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.779022 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955" Apr 16 17:40:23.781018 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.780997 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-r9plp\"" Apr 16 17:40:23.781127 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.781090 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.781535 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.781516 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 17:40:23.781657 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.781552 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.791245 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.791221 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"] Apr 16 17:40:23.791367 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.791346 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" Apr 16 17:40:23.793248 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.793058 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 17:40:23.793407 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.793392 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 17:40:23.793482 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.793402 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.793535 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.793485 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.793798 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.793783 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-c2nn6\"" Apr 16 17:40:23.803711 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.802777 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 17:40:23.803711 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.803045 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"] Apr 16 17:40:23.803711 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.803207 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7" Apr 16 17:40:23.804990 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.804973 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 17:40:23.805201 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.805140 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 17:40:23.805298 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.805225 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-rp6nl\"" Apr 16 17:40:23.805298 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.805243 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.805298 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.805155 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.813129 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.813110 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht"] Apr 16 17:40:23.813129 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.813131 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zcvg5"] Apr 16 17:40:23.813254 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.813140 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f8454dc74-5t5l5"] Apr 16 17:40:23.813254 
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.813150 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5"] Apr 16 17:40:23.813314 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.813264 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg" Apr 16 17:40:23.815112 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.815090 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.815221 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.815114 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 17:40:23.815273 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.815249 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.815692 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.815672 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 17:40:23.815824 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.815720 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-khwsd\"" Apr 16 17:40:23.820888 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.820679 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"] Apr 16 17:40:23.833164 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.833141 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt"] Apr 16 17:40:23.833288 
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.833209 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5" Apr 16 17:40:23.833288 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.833249 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx" Apr 16 17:40:23.835859 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.835835 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-qttzs\"" Apr 16 17:40:23.835859 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.835848 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-7hz8t\"" Apr 16 17:40:23.836055 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.835881 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.836055 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.835835 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.836206 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.836152 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.836206 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.836189 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 17:40:23.836525 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.836401 2574 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 17:40:23.837136 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.837121 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.842720 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.842560 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"] Apr 16 17:40:23.842720 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.842698 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:23.845054 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.844882 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 17:40:23.849332 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.849313 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"] Apr 16 17:40:23.849440 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.849337 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jv4sl"] Apr 16 17:40:23.849499 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.849447 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:23.851665 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.851646 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 17:40:23.851823 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.851728 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 17:40:23.852136 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.852118 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 17:40:23.852215 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.852123 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 17:40:23.858143 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.858124 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"] Apr 16 17:40:23.858232 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.858154 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt"] Apr 16 17:40:23.858232 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.858169 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5"] Apr 16 17:40:23.858232 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.858180 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"] Apr 16 17:40:23.858232 ip-10-0-138-45 
kubenswrapper[2574]: I0416 17:40:23.858196 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"] Apr 16 17:40:23.858232 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.858207 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"] Apr 16 17:40:23.858232 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.858218 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jv4sl"] Apr 16 17:40:23.858514 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.858244 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bs84n"] Apr 16 17:40:23.858514 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.858268 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jv4sl" Apr 16 17:40:23.860197 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.860179 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.860360 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.860213 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 17:40:23.860455 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.860299 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zdjnc\"" Apr 16 17:40:23.860455 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.860304 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.861498 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.861472 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-bs84n"] Apr 16 17:40:23.861639 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.861622 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bs84n" Apr 16 17:40:23.863527 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.863509 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qhzh2\"" Apr 16 17:40:23.864058 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.864038 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 17:40:23.864274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.864239 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 17:40:23.864520 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.864503 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 17:40:23.864771 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.864752 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 17:40:23.909683 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.909655 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:40:23.909860 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.909701 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-installation-pull-secrets\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:40:23.909860 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.909745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-bound-sa-token\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:40:23.909860 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.909776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d3e502c-98bf-4943-aa5a-2412d9b53369-ca-trust-extracted\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:40:23.909860 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.909799 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-config\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" Apr 16 17:40:23.910494 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-serving-cert\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:23.910638 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910545 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsflf\" (UniqueName: \"kubernetes.io/projected/7df6448d-96ef-4887-bf9c-4b4dfb72c238-kube-api-access-wsflf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:23.910638 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qn24\" (UniqueName: \"kubernetes.io/projected/3bb520c8-5109-453e-8cb0-2d1c298b05a9-kube-api-access-4qn24\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:23.910752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-trusted-ca\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:23.910752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910679 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nldfw\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-kube-api-access-nldfw\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:23.910752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910734 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-image-registry-private-configuration\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:23.910859 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdqnp\" (UniqueName: \"kubernetes.io/projected/a957470a-21dc-4720-b171-61e476f5ec68-kube-api-access-kdqnp\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:23.910859 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec0d3d6e-865b-4664-bf7c-6b26346d0631-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx\" (UID: \"ec0d3d6e-865b-4664-bf7c-6b26346d0631\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"
Apr 16 17:40:23.910922 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-certificates\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:23.910922 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kq4r\" (UniqueName: \"kubernetes.io/projected/ec0d3d6e-865b-4664-bf7c-6b26346d0631-kube-api-access-9kq4r\") pod \"managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx\" (UID: \"ec0d3d6e-865b-4664-bf7c-6b26346d0631\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"
Apr 16 17:40:23.911011 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.910997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7df6448d-96ef-4887-bf9c-4b4dfb72c238-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:23.911055 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.911039 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-928x8\" (UniqueName: \"kubernetes.io/projected/c2483f11-da48-413a-a820-2d4de16e6ae2-kube-api-access-928x8\") pod \"volume-data-source-validator-7d955d5dd4-n6pht\" (UID: \"c2483f11-da48-413a-a820-2d4de16e6ae2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht"
Apr 16 17:40:23.911101 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.911068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a957470a-21dc-4720-b171-61e476f5ec68-serving-cert\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:23.911101 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.911095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df6448d-96ef-4887-bf9c-4b4dfb72c238-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:23.911175 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.911127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:23.911226 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.911201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-trusted-ca\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:23.911281 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.911246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbq2c\" (UniqueName: \"kubernetes.io/projected/e224778c-6e6a-4ece-bf4c-6f9f2accf344-kube-api-access-lbq2c\") pod \"network-check-source-7b678d77c7-dn2q5\" (UID: \"e224778c-6e6a-4ece-bf4c-6f9f2accf344\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5"
Apr 16 17:40:23.911281 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.911273 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l85n\" (UniqueName: \"kubernetes.io/projected/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-kube-api-access-6l85n\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:23.911375 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:23.911301 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a957470a-21dc-4720-b171-61e476f5ec68-config\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:24.011934 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.011841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-tmp\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt"
Apr 16 17:40:24.011934 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.011894 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-hub\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:40:24.011934 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.011930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdqnp\" (UniqueName: \"kubernetes.io/projected/a957470a-21dc-4720-b171-61e476f5ec68-kube-api-access-kdqnp\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:24.012208 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.011965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kq4r\" (UniqueName: \"kubernetes.io/projected/ec0d3d6e-865b-4664-bf7c-6b26346d0631-kube-api-access-9kq4r\") pod \"managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx\" (UID: \"ec0d3d6e-865b-4664-bf7c-6b26346d0631\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"
Apr 16 17:40:24.012208 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.011996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-928x8\" (UniqueName: \"kubernetes.io/projected/c2483f11-da48-413a-a820-2d4de16e6ae2-kube-api-access-928x8\") pod \"volume-data-source-validator-7d955d5dd4-n6pht\" (UID: \"c2483f11-da48-413a-a820-2d4de16e6ae2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht"
Apr 16 17:40:24.012208 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a957470a-21dc-4720-b171-61e476f5ec68-serving-cert\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:24.012208 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df6448d-96ef-4887-bf9c-4b4dfb72c238-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:24.012208 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012076 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:24.012467 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012294 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcm6v\" (UniqueName: \"kubernetes.io/projected/0583cad7-fc5e-4718-a618-7be058c6fa40-kube-api-access-tcm6v\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:40:24.012467 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbq2c\" (UniqueName: \"kubernetes.io/projected/e224778c-6e6a-4ece-bf4c-6f9f2accf344-kube-api-access-lbq2c\") pod \"network-check-source-7b678d77c7-dn2q5\" (UID: \"e224778c-6e6a-4ece-bf4c-6f9f2accf344\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5"
Apr 16 17:40:24.012467 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l85n\" (UniqueName: \"kubernetes.io/projected/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-kube-api-access-6l85n\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.012467 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d237b125-8598-4bde-9a2c-f3145e257d0a-config-volume\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:24.012467 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-klusterlet-config\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt"
Apr 16 17:40:24.012735 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-bound-sa-token\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.012735 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-installation-pull-secrets\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.012735 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-config\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.012735 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012601 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d3e502c-98bf-4943-aa5a-2412d9b53369-ca-trust-extracted\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.012735 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012630 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:40:24.012735 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qn24\" (UniqueName: \"kubernetes.io/projected/3bb520c8-5109-453e-8cb0-2d1c298b05a9-kube-api-access-4qn24\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:24.013011 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-trusted-ca\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.013011 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012927 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn2ww\" (UniqueName: \"kubernetes.io/projected/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-kube-api-access-mn2ww\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:24.013011 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.012960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsflf\" (UniqueName: \"kubernetes.io/projected/7df6448d-96ef-4887-bf9c-4b4dfb72c238-kube-api-access-wsflf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:24.013167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:24.013167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-ca\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:40:24.013167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0583cad7-fc5e-4718-a618-7be058c6fa40-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:40:24.013167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nldfw\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-kube-api-access-nldfw\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.013167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-image-registry-private-configuration\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.013167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-certificates\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7df6448d-96ef-4887-bf9c-4b4dfb72c238-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013218 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d3e502c-98bf-4943-aa5a-2412d9b53369-ca-trust-extracted\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d237b125-8598-4bde-9a2c-f3145e257d0a-tmp-dir\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.013329 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-trusted-ca\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a957470a-21dc-4720-b171-61e476f5ec68-config\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.013392 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls podName:3bb520c8-5109-453e-8cb0-2d1c298b05a9 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:24.513372376 +0000 UTC m=+34.672742640 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls") pod "cluster-samples-operator-667775844f-gm955" (UID: "3bb520c8-5109-453e-8cb0-2d1c298b05a9") : secret "samples-operator-tls" not found
Apr 16 17:40:24.013433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013422 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq955\" (UniqueName: \"kubernetes.io/projected/d237b125-8598-4bde-9a2c-f3145e257d0a-kube-api-access-sq955\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:24.013887 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013448 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9njv\" (UniqueName: \"kubernetes.io/projected/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-kube-api-access-m9njv\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt"
Apr 16 17:40:24.013887 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-serving-cert\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.013887 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013501 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec0d3d6e-865b-4664-bf7c-6b26346d0631-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx\" (UID: \"ec0d3d6e-865b-4664-bf7c-6b26346d0631\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"
Apr 16 17:40:24.013887 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.013887 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.013668 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:40:24.013887 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.013680 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f8454dc74-5t5l5: secret "image-registry-tls" not found
Apr 16 17:40:24.014146 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.013912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-config\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.014146 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.013959 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls podName:7d3e502c-98bf-4943-aa5a-2412d9b53369 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:24.513942564 +0000 UTC m=+34.673312824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls") pod "image-registry-7f8454dc74-5t5l5" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369") : secret "image-registry-tls" not found
Apr 16 17:40:24.014146 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.014006 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a957470a-21dc-4720-b171-61e476f5ec68-config\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:24.014146 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.014025 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7df6448d-96ef-4887-bf9c-4b4dfb72c238-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:24.014506 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.014146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-trusted-ca\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.014633 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.014611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-certificates\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.014860 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.014843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-trusted-ca\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.017885 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.017728 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df6448d-96ef-4887-bf9c-4b4dfb72c238-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:24.017885 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.017860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-installation-pull-secrets\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.017885 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.017871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-image-registry-private-configuration\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.018107 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.018047 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ec0d3d6e-865b-4664-bf7c-6b26346d0631-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx\" (UID: \"ec0d3d6e-865b-4664-bf7c-6b26346d0631\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"
Apr 16 17:40:24.018365 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.018347 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-serving-cert\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.019017 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.018998 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a957470a-21dc-4720-b171-61e476f5ec68-serving-cert\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:24.027254 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.027224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nldfw\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-kube-api-access-nldfw\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.027624 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.027546 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l85n\" (UniqueName: \"kubernetes.io/projected/05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd-kube-api-access-6l85n\") pod \"console-operator-d87b8d5fc-zcvg5\" (UID: \"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.028685 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.028640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsflf\" (UniqueName: \"kubernetes.io/projected/7df6448d-96ef-4887-bf9c-4b4dfb72c238-kube-api-access-wsflf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-qdmf7\" (UID: \"7df6448d-96ef-4887-bf9c-4b4dfb72c238\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"
Apr 16 17:40:24.028809 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.028699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kq4r\" (UniqueName: \"kubernetes.io/projected/ec0d3d6e-865b-4664-bf7c-6b26346d0631-kube-api-access-9kq4r\") pod \"managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx\" (UID: \"ec0d3d6e-865b-4664-bf7c-6b26346d0631\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"
Apr 16 17:40:24.028885 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.028865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-928x8\" (UniqueName: \"kubernetes.io/projected/c2483f11-da48-413a-a820-2d4de16e6ae2-kube-api-access-928x8\") pod \"volume-data-source-validator-7d955d5dd4-n6pht\" (UID: \"c2483f11-da48-413a-a820-2d4de16e6ae2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht"
Apr 16 17:40:24.029633 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.029615 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdqnp\" (UniqueName: \"kubernetes.io/projected/a957470a-21dc-4720-b171-61e476f5ec68-kube-api-access-kdqnp\") pod \"service-ca-operator-69965bb79d-7fwlg\" (UID: \"a957470a-21dc-4720-b171-61e476f5ec68\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"
Apr 16 17:40:24.030668 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.030561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qn24\" (UniqueName: \"kubernetes.io/projected/3bb520c8-5109-453e-8cb0-2d1c298b05a9-kube-api-access-4qn24\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:24.030760 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.030692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbq2c\" (UniqueName: \"kubernetes.io/projected/e224778c-6e6a-4ece-bf4c-6f9f2accf344-kube-api-access-lbq2c\") pod \"network-check-source-7b678d77c7-dn2q5\" (UID: \"e224778c-6e6a-4ece-bf4c-6f9f2accf344\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5"
Apr 16 17:40:24.037034 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.036979 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-bound-sa-token\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:24.084189 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.084158 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht"
Apr 16 17:40:24.108494 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.108460 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:24.114539 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-hub\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:40:24.114681 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:24.114681 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcm6v\" (UniqueName: \"kubernetes.io/projected/0583cad7-fc5e-4718-a618-7be058c6fa40-kube-api-access-tcm6v\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:40:24.114681 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d237b125-8598-4bde-9a2c-f3145e257d0a-config-volume\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:24.114681 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName:
\"kubernetes.io/secret/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-klusterlet-config\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.114692 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn2ww\" (UniqueName: \"kubernetes.io/projected/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-kube-api-access-mn2ww\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl" Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n" Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.114778 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert 
podName:c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:24.614755175 +0000 UTC m=+34.774125452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert") pod "ingress-canary-jv4sl" (UID: "c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c") : secret "canary-serving-cert" not found Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-ca\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.114813 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0583cad7-fc5e-4718-a618-7be058c6fa40-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.114881 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.114865 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls podName:d237b125-8598-4bde-9a2c-f3145e257d0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:24.614850793 +0000 UTC m=+34.774221055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls") pod "dns-default-bs84n" (UID: "d237b125-8598-4bde-9a2c-f3145e257d0a") : secret "dns-default-metrics-tls" not found Apr 16 17:40:24.115313 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.115313 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d237b125-8598-4bde-9a2c-f3145e257d0a-tmp-dir\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n" Apr 16 17:40:24.115313 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sq955\" (UniqueName: \"kubernetes.io/projected/d237b125-8598-4bde-9a2c-f3145e257d0a-kube-api-access-sq955\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n" Apr 16 17:40:24.115313 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9njv\" (UniqueName: \"kubernetes.io/projected/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-kube-api-access-m9njv\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:24.115313 
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.114993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-tmp\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:24.115643 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.115372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-tmp\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:24.115643 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.115509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/0583cad7-fc5e-4718-a618-7be058c6fa40-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.116353 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.116201 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7" Apr 16 17:40:24.116353 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.116279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d237b125-8598-4bde-9a2c-f3145e257d0a-config-volume\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n" Apr 16 17:40:24.116353 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.116296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d237b125-8598-4bde-9a2c-f3145e257d0a-tmp-dir\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n" Apr 16 17:40:24.118470 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.118447 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-klusterlet-config\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:24.118752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.118723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.118846 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.118759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-hub\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.119067 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.119045 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-ca\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.119805 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.119786 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0583cad7-fc5e-4718-a618-7be058c6fa40-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.124844 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.124678 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg" Apr 16 17:40:24.127027 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.127002 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ncccq"] Apr 16 17:40:24.134919 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.134892 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcm6v\" (UniqueName: \"kubernetes.io/projected/0583cad7-fc5e-4718-a618-7be058c6fa40-kube-api-access-tcm6v\") pod \"cluster-proxy-proxy-agent-76b6b8b57d-ht8l8\" (UID: \"0583cad7-fc5e-4718-a618-7be058c6fa40\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.135048 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.134896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn2ww\" (UniqueName: \"kubernetes.io/projected/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-kube-api-access-mn2ww\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl" Apr 16 17:40:24.135048 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.134898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq955\" (UniqueName: \"kubernetes.io/projected/d237b125-8598-4bde-9a2c-f3145e257d0a-kube-api-access-sq955\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n" Apr 16 17:40:24.137268 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.137246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9njv\" (UniqueName: \"kubernetes.io/projected/6b4b86fb-c5fd-4c18-8467-71f4575e6f41-kube-api-access-m9njv\") pod \"klusterlet-addon-workmgr-9c6fb4dd8-k7hnt\" (UID: \"6b4b86fb-c5fd-4c18-8467-71f4575e6f41\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:24.147676 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.147648 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.150074 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.150037 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jcfh8\"" Apr 16 17:40:24.151623 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.151565 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx" Apr 16 17:40:24.158432 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.158412 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5" Apr 16 17:40:24.166854 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.166830 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:24.174458 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.173819 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" Apr 16 17:40:24.222169 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.216411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm" Apr 16 17:40:24.222169 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.216597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvl7j\" (UniqueName: \"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:24.222169 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.217377 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:24.222169 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.217459 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs podName:88ac585a-175b-45f2-8696-1c754b2a5a05 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:56.21743872 +0000 UTC m=+66.376808982 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs") pod "network-metrics-daemon-cxxrm" (UID: "88ac585a-175b-45f2-8696-1c754b2a5a05") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:24.239724 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.232265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvl7j\" (UniqueName: \"kubernetes.io/projected/b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d-kube-api-access-rvl7j\") pod \"network-check-target-2kgzs\" (UID: \"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d\") " pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:24.318958 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.318694 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-hosts-file\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.318958 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.318801 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-tmp-dir\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.318958 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.318833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drtdk\" (UniqueName: \"kubernetes.io/projected/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-kube-api-access-drtdk\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.366646 ip-10-0-138-45 
kubenswrapper[2574]: I0416 17:40:24.366447 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht"] Apr 16 17:40:24.378330 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.378275 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg"] Apr 16 17:40:24.382027 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.381973 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-zcvg5"] Apr 16 17:40:24.389692 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.389349 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7"] Apr 16 17:40:24.421701 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.421004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-hosts-file\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.421701 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.421075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-tmp-dir\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.421701 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.421094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drtdk\" (UniqueName: \"kubernetes.io/projected/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-kube-api-access-drtdk\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " 
pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.421701 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.421378 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-hosts-file\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.421701 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.421650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-tmp-dir\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.428150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.426591 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx"] Apr 16 17:40:24.436420 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.436368 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5"] Apr 16 17:40:24.437103 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.437083 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drtdk\" (UniqueName: \"kubernetes.io/projected/cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25-kube-api-access-drtdk\") pod \"node-resolver-ncccq\" (UID: \"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25\") " pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.441952 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:24.440053 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec0d3d6e_865b_4664_bf7c_6b26346d0631.slice/crio-2b44c3cd00e262518eaa18e36c975a8bb68200a01056b72afa438df7081f3feb WatchSource:0}: Error finding container 
2b44c3cd00e262518eaa18e36c975a8bb68200a01056b72afa438df7081f3feb: Status 404 returned error can't find the container with id 2b44c3cd00e262518eaa18e36c975a8bb68200a01056b72afa438df7081f3feb Apr 16 17:40:24.441952 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:24.441698 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode224778c_6e6a_4ece_bf4c_6f9f2accf344.slice/crio-3031c15696ace3279fbbe6cca36d92e6fedb654d7996b235814e2766e2045b8e WatchSource:0}: Error finding container 3031c15696ace3279fbbe6cca36d92e6fedb654d7996b235814e2766e2045b8e: Status 404 returned error can't find the container with id 3031c15696ace3279fbbe6cca36d92e6fedb654d7996b235814e2766e2045b8e Apr 16 17:40:24.461257 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.461228 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt"] Apr 16 17:40:24.462191 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:24.461472 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4b86fb_c5fd_4c18_8467_71f4575e6f41.slice/crio-e505ca4c75cfea1dbc7fefc3aa46fe7aabf20793542f5ebfc427818b53e44425 WatchSource:0}: Error finding container e505ca4c75cfea1dbc7fefc3aa46fe7aabf20793542f5ebfc427818b53e44425: Status 404 returned error can't find the container with id e505ca4c75cfea1dbc7fefc3aa46fe7aabf20793542f5ebfc427818b53e44425 Apr 16 17:40:24.463141 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.462932 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ncccq" Apr 16 17:40:24.472185 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:24.472155 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf1030f1_c004_49a8_a6a6_1c7e1bbe4b25.slice/crio-30ee4076220ed7af39b52ceae635880b4e38bd04a4470812a187d54e4eb4fc2c WatchSource:0}: Error finding container 30ee4076220ed7af39b52ceae635880b4e38bd04a4470812a187d54e4eb4fc2c: Status 404 returned error can't find the container with id 30ee4076220ed7af39b52ceae635880b4e38bd04a4470812a187d54e4eb4fc2c Apr 16 17:40:24.482500 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.482477 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"] Apr 16 17:40:24.486477 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:24.486452 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0583cad7_fc5e_4718_a618_7be058c6fa40.slice/crio-e0afe5eb5e413643da03f5f47b586d59b4a26e40990e241a759ee9c39dc92956 WatchSource:0}: Error finding container e0afe5eb5e413643da03f5f47b586d59b4a26e40990e241a759ee9c39dc92956: Status 404 returned error can't find the container with id e0afe5eb5e413643da03f5f47b586d59b4a26e40990e241a759ee9c39dc92956 Apr 16 17:40:24.521704 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.521672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955" Apr 16 17:40:24.521874 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.521723 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:40:24.521874 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.521834 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 17:40:24.521874 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.521847 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:24.521874 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.521863 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f8454dc74-5t5l5: secret "image-registry-tls" not found Apr 16 17:40:24.522072 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.521912 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls podName:7d3e502c-98bf-4943-aa5a-2412d9b53369 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:25.52189931 +0000 UTC m=+35.681269570 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls") pod "image-registry-7f8454dc74-5t5l5" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369") : secret "image-registry-tls" not found Apr 16 17:40:24.522072 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.521925 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls podName:3bb520c8-5109-453e-8cb0-2d1c298b05a9 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:25.521919619 +0000 UTC m=+35.681289879 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls") pod "cluster-samples-operator-667775844f-gm955" (UID: "3bb520c8-5109-453e-8cb0-2d1c298b05a9") : secret "samples-operator-tls" not found
Apr 16 17:40:24.584982 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.584942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ncccq" event={"ID":"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25","Type":"ContainerStarted","Data":"30ee4076220ed7af39b52ceae635880b4e38bd04a4470812a187d54e4eb4fc2c"}
Apr 16 17:40:24.586178 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.586142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7" event={"ID":"7df6448d-96ef-4887-bf9c-4b4dfb72c238","Type":"ContainerStarted","Data":"be6c37be70605f5ae97174cfb14e418ac10c9d083cfa48cd5b2c93241499eafc"}
Apr 16 17:40:24.587210 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.587174 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" event={"ID":"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd","Type":"ContainerStarted","Data":"4255325e07a8d23693f0cfb5138a5d4615fbc5b2c0bb75c571a35097e02aee7f"}
Apr 16 17:40:24.588294 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.588272 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx" event={"ID":"ec0d3d6e-865b-4664-bf7c-6b26346d0631","Type":"ContainerStarted","Data":"2b44c3cd00e262518eaa18e36c975a8bb68200a01056b72afa438df7081f3feb"}
Apr 16 17:40:24.590799 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.590778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerStarted","Data":"a6f2caaad31a6b0189dd1386cdff48989acb2b1f3bbee4d4738d0cada2adda57"}
Apr 16 17:40:24.592007 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.591987 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" event={"ID":"0583cad7-fc5e-4718-a618-7be058c6fa40","Type":"ContainerStarted","Data":"e0afe5eb5e413643da03f5f47b586d59b4a26e40990e241a759ee9c39dc92956"}
Apr 16 17:40:24.593019 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.592985 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" event={"ID":"6b4b86fb-c5fd-4c18-8467-71f4575e6f41","Type":"ContainerStarted","Data":"e505ca4c75cfea1dbc7fefc3aa46fe7aabf20793542f5ebfc427818b53e44425"}
Apr 16 17:40:24.594025 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.594006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg" event={"ID":"a957470a-21dc-4720-b171-61e476f5ec68","Type":"ContainerStarted","Data":"cbeebdf65bde4c76b6600d3c0d8f18bdc8d9215cb77c675fba389d2746f1d3d6"}
Apr 16 17:40:24.595117 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.595098 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5" event={"ID":"e224778c-6e6a-4ece-bf4c-6f9f2accf344","Type":"ContainerStarted","Data":"3031c15696ace3279fbbe6cca36d92e6fedb654d7996b235814e2766e2045b8e"}
Apr 16 17:40:24.596261 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.596227 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht" event={"ID":"c2483f11-da48-413a-a820-2d4de16e6ae2","Type":"ContainerStarted","Data":"01189e4a86569b12fb6ee0007b8794ea3a61ac80c6a7a9d9f298e3ef3465299f"}
Apr 16 17:40:24.622616 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.622566 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:24.622740 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:24.622661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:24.622740 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.622670 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:40:24.622740 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.622735 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert podName:c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:25.622709951 +0000 UTC m=+35.782080211 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert") pod "ingress-canary-jv4sl" (UID: "c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c") : secret "canary-serving-cert" not found
Apr 16 17:40:24.622873 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.622767 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:40:24.622873 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:24.622806 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls podName:d237b125-8598-4bde-9a2c-f3145e257d0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:25.622792693 +0000 UTC m=+35.782162954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls") pod "dns-default-bs84n" (UID: "d237b125-8598-4bde-9a2c-f3145e257d0a") : secret "dns-default-metrics-tls" not found
Apr 16 17:40:25.452235 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.450762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:25.452235 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.451220 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:25.452235 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.451671 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:25.457309 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.453958 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 17:40:25.457309 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.454488 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 17:40:25.457309 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.454723 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hvrx4\""
Apr 16 17:40:25.457309 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.454907 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdxwt\""
Apr 16 17:40:25.471251 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.471226 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:40:25.531620 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.531559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:25.531783 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.531731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:25.531882 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.531869 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 17:40:25.531952 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.531933 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls podName:3bb520c8-5109-453e-8cb0-2d1c298b05a9 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:27.531914813 +0000 UTC m=+37.691285078 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls") pod "cluster-samples-operator-667775844f-gm955" (UID: "3bb520c8-5109-453e-8cb0-2d1c298b05a9") : secret "samples-operator-tls" not found
Apr 16 17:40:25.532370 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.532350 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:40:25.532370 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.532372 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f8454dc74-5t5l5: secret "image-registry-tls" not found
Apr 16 17:40:25.532526 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.532418 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls podName:7d3e502c-98bf-4943-aa5a-2412d9b53369 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:27.532403348 +0000 UTC m=+37.691773628 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls") pod "image-registry-7f8454dc74-5t5l5" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369") : secret "image-registry-tls" not found
Apr 16 17:40:25.620459 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.620426 2574 generic.go:358] "Generic (PLEG): container finished" podID="37af66f0-77b6-4e5a-9cd0-c7267f06ab11" containerID="a6f2caaad31a6b0189dd1386cdff48989acb2b1f3bbee4d4738d0cada2adda57" exitCode=0
Apr 16 17:40:25.620642 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.620565 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerDied","Data":"a6f2caaad31a6b0189dd1386cdff48989acb2b1f3bbee4d4738d0cada2adda57"}
Apr 16 17:40:25.635970 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.635666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:25.635970 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.635773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:25.636564 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.636542 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:40:25.636676 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.636623 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert podName:c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:27.636602432 +0000 UTC m=+37.795972696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert") pod "ingress-canary-jv4sl" (UID: "c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c") : secret "canary-serving-cert" not found
Apr 16 17:40:25.637249 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.637226 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:40:25.637327 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:25.637278 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls podName:d237b125-8598-4bde-9a2c-f3145e257d0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:27.637264826 +0000 UTC m=+37.796635088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls") pod "dns-default-bs84n" (UID: "d237b125-8598-4bde-9a2c-f3145e257d0a") : secret "dns-default-metrics-tls" not found
Apr 16 17:40:25.637810 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.637786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ncccq" event={"ID":"cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25","Type":"ContainerStarted","Data":"cc5278687cbd33b5520d6cf09ceed8765c69d741ca3fba1885ee8a23b9ee4be8"}
Apr 16 17:40:25.676732 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.674116 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ncccq" podStartSLOduration=1.674095022 podStartE2EDuration="1.674095022s" podCreationTimestamp="2026-04-16 17:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:25.673435248 +0000 UTC m=+35.832805531" watchObservedRunningTime="2026-04-16 17:40:25.674095022 +0000 UTC m=+35.833465307"
Apr 16 17:40:25.696114 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:25.696007 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2kgzs"]
Apr 16 17:40:25.721116 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:25.714545 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2fa5fab_8dc1_4b89_bb5a_d252dd35fa2d.slice/crio-2b8129bc9034ade591b5237e24ce682123458d623ff182a9fc500215dc444bd2 WatchSource:0}: Error finding container 2b8129bc9034ade591b5237e24ce682123458d623ff182a9fc500215dc444bd2: Status 404 returned error can't find the container with id 2b8129bc9034ade591b5237e24ce682123458d623ff182a9fc500215dc444bd2
Apr 16 17:40:26.652433 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:26.651881 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2kgzs" event={"ID":"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d","Type":"ContainerStarted","Data":"2b8129bc9034ade591b5237e24ce682123458d623ff182a9fc500215dc444bd2"}
Apr 16 17:40:26.691503 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:26.691463 2574 generic.go:358] "Generic (PLEG): container finished" podID="37af66f0-77b6-4e5a-9cd0-c7267f06ab11" containerID="5204d319cb362b740b9697c727eba3757256a61514f4bcd84c80a92d2204d598" exitCode=0
Apr 16 17:40:26.692387 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:26.692349 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerDied","Data":"5204d319cb362b740b9697c727eba3757256a61514f4bcd84c80a92d2204d598"}
Apr 16 17:40:27.558461 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.557706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:27.558461 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.557771 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:27.558461 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.557897 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:40:27.558461 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.557911 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f8454dc74-5t5l5: secret "image-registry-tls" not found
Apr 16 17:40:27.558461 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.557970 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls podName:7d3e502c-98bf-4943-aa5a-2412d9b53369 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:31.557952061 +0000 UTC m=+41.717322323 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls") pod "image-registry-7f8454dc74-5t5l5" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369") : secret "image-registry-tls" not found
Apr 16 17:40:27.558461 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.558370 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 17:40:27.558461 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.558421 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls podName:3bb520c8-5109-453e-8cb0-2d1c298b05a9 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:31.55840557 +0000 UTC m=+41.717775834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls") pod "cluster-samples-operator-667775844f-gm955" (UID: "3bb520c8-5109-453e-8cb0-2d1c298b05a9") : secret "samples-operator-tls" not found
Apr 16 17:40:27.598941 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.598889 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"]
Apr 16 17:40:27.611827 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.611794 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"]
Apr 16 17:40:27.611997 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.611945 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:27.614960 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.614333 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 17:40:27.614960 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.614558 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bv2x4\""
Apr 16 17:40:27.614960 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.614803 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 17:40:27.658679 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.658640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:27.659243 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.658730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:27.659243 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.658887 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:40:27.659243 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.658956 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls podName:d237b125-8598-4bde-9a2c-f3145e257d0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:31.658938427 +0000 UTC m=+41.818308701 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls") pod "dns-default-bs84n" (UID: "d237b125-8598-4bde-9a2c-f3145e257d0a") : secret "dns-default-metrics-tls" not found
Apr 16 17:40:27.659416 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.659369 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:40:27.659468 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.659418 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert podName:c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:31.659403587 +0000 UTC m=+41.818773850 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert") pod "ingress-canary-jv4sl" (UID: "c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c") : secret "canary-serving-cert" not found
Apr 16 17:40:27.707185 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.706299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" event={"ID":"37af66f0-77b6-4e5a-9cd0-c7267f06ab11","Type":"ContainerStarted","Data":"370d490e0ed656c7359b8cfe27f9e13549c41b68b7452bb113c6d262f1ae1539"}
Apr 16 17:40:27.733264 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.732859 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5ww9l" podStartSLOduration=6.631704845 podStartE2EDuration="37.732836647s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:39:53.21306311 +0000 UTC m=+3.372433382" lastFinishedPulling="2026-04-16 17:40:24.31419492 +0000 UTC m=+34.473565184" observedRunningTime="2026-04-16 17:40:27.731322865 +0000 UTC m=+37.890693149" watchObservedRunningTime="2026-04-16 17:40:27.732836647 +0000 UTC m=+37.892206930"
Apr 16 17:40:27.760126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.759926 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c8d26167-1916-46e8-8dba-80fb8694cf64-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:27.760126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.760018 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:27.863413 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.861133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c8d26167-1916-46e8-8dba-80fb8694cf64-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:27.863413 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.861259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:27.863413 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:27.862444 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c8d26167-1916-46e8-8dba-80fb8694cf64-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:27.863413 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.863053 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 17:40:27.863413 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:27.863113 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert podName:c8d26167-1916-46e8-8dba-80fb8694cf64 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:28.363094594 +0000 UTC m=+38.522464858 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bbrtg" (UID: "c8d26167-1916-46e8-8dba-80fb8694cf64") : secret "networking-console-plugin-cert" not found
Apr 16 17:40:28.366298 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:28.366254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:28.366528 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:28.366510 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 17:40:28.366620 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:28.366591 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert podName:c8d26167-1916-46e8-8dba-80fb8694cf64 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:29.366556704 +0000 UTC m=+39.525926964 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bbrtg" (UID: "c8d26167-1916-46e8-8dba-80fb8694cf64") : secret "networking-console-plugin-cert" not found
Apr 16 17:40:29.376026 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:29.375978 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:29.376618 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:29.376152 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 17:40:29.376618 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:29.376236 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert podName:c8d26167-1916-46e8-8dba-80fb8694cf64 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:31.376214371 +0000 UTC m=+41.535584636 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bbrtg" (UID: "c8d26167-1916-46e8-8dba-80fb8694cf64") : secret "networking-console-plugin-cert" not found
Apr 16 17:40:31.394458 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:31.394419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:31.395027 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.394610 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 17:40:31.395027 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.394698 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert podName:c8d26167-1916-46e8-8dba-80fb8694cf64 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:35.394677239 +0000 UTC m=+45.554047516 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bbrtg" (UID: "c8d26167-1916-46e8-8dba-80fb8694cf64") : secret "networking-console-plugin-cert" not found
Apr 16 17:40:31.596124 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:31.596084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:31.596306 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:31.596140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:31.596306 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.596257 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 17:40:31.596306 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.596294 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 17:40:31.596306 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.596309 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f8454dc74-5t5l5: secret "image-registry-tls" not found
Apr 16 17:40:31.596470 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.596328 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls podName:3bb520c8-5109-453e-8cb0-2d1c298b05a9 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.596311964 +0000 UTC m=+49.755682228 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls") pod "cluster-samples-operator-667775844f-gm955" (UID: "3bb520c8-5109-453e-8cb0-2d1c298b05a9") : secret "samples-operator-tls" not found
Apr 16 17:40:31.596470 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.596358 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls podName:7d3e502c-98bf-4943-aa5a-2412d9b53369 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.596342316 +0000 UTC m=+49.755712577 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls") pod "image-registry-7f8454dc74-5t5l5" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369") : secret "image-registry-tls" not found
Apr 16 17:40:31.696817 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:31.696786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:31.696976 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:31.696886 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:31.696976 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.696947 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 17:40:31.696976 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.696967 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 17:40:31.697078 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.697011 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert podName:c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.696994867 +0000 UTC m=+49.856365129 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert") pod "ingress-canary-jv4sl" (UID: "c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c") : secret "canary-serving-cert" not found
Apr 16 17:40:31.697078 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:31.697026 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls podName:d237b125-8598-4bde-9a2c-f3145e257d0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.697019243 +0000 UTC m=+49.856389510 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls") pod "dns-default-bs84n" (UID: "d237b125-8598-4bde-9a2c-f3145e257d0a") : secret "dns-default-metrics-tls" not found
Apr 16 17:40:34.621636 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:34.621585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:34.626440 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:34.626409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cf7246f1-b28a-4d7e-bda9-3608f1c76516-original-pull-secret\") pod \"global-pull-secret-syncer-49fnc\" (UID: \"cf7246f1-b28a-4d7e-bda9-3608f1c76516\") " pod="kube-system/global-pull-secret-syncer-49fnc"
Apr 16 17:40:34.815677 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:34.815640 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-49fnc" Apr 16 17:40:35.427716 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:35.427681 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg" Apr 16 17:40:35.427892 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:35.427855 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:40:35.427941 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:35.427934 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert podName:c8d26167-1916-46e8-8dba-80fb8694cf64 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:43.42791712 +0000 UTC m=+53.587287379 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bbrtg" (UID: "c8d26167-1916-46e8-8dba-80fb8694cf64") : secret "networking-console-plugin-cert" not found Apr 16 17:40:36.302308 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.302250 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-49fnc"] Apr 16 17:40:36.728496 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.728456 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" event={"ID":"0583cad7-fc5e-4718-a618-7be058c6fa40","Type":"ContainerStarted","Data":"b41f28fd8235e345a419d55ea5d37e6e11c7824ce36bdce2a8f9081e05d4fd18"} Apr 16 17:40:36.729844 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.729814 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" event={"ID":"6b4b86fb-c5fd-4c18-8467-71f4575e6f41","Type":"ContainerStarted","Data":"2df8ffdddda9732974a93091c18ef21a1675508b8497a173c806703e1830d58e"} Apr 16 17:40:36.730144 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.730127 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:36.731175 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.731152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg" event={"ID":"a957470a-21dc-4720-b171-61e476f5ec68","Type":"ContainerStarted","Data":"a9a19f4f24c0a4d321994be228a3103f3eecc886c7de8e3bdc96639f9e54abe3"} Apr 16 17:40:36.737536 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.737512 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" Apr 16 17:40:36.737766 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.737735 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-49fnc" event={"ID":"cf7246f1-b28a-4d7e-bda9-3608f1c76516","Type":"ContainerStarted","Data":"25a67e66350defba1353e7e14dc1156bac1e5018406d5a660d3f3ecc266813a9"} Apr 16 17:40:36.739060 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.739030 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5" event={"ID":"e224778c-6e6a-4ece-bf4c-6f9f2accf344","Type":"ContainerStarted","Data":"c7bfd0a6ac2fb2e026d97bf883fe53dfa53af5d83184d201996c7bb74bb67a2a"} Apr 16 17:40:36.740360 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.740338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht" event={"ID":"c2483f11-da48-413a-a820-2d4de16e6ae2","Type":"ContainerStarted","Data":"2cdb84c28e34dc75bc4ee69010e2e9d5befdd27ddbcb3ad2cfcdbd48b171c240"} Apr 16 17:40:36.741682 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.741659 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7" event={"ID":"7df6448d-96ef-4887-bf9c-4b4dfb72c238","Type":"ContainerStarted","Data":"e6107f2c7f154063e69356a4e0a729c43ce94bfa7d54bc67df288f8b8e47d476"} Apr 16 17:40:36.743348 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.743330 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/0.log" Apr 16 17:40:36.743436 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.743366 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd" containerID="00996d795eac846e0fe68dc81750ea3b21869613635d14ee386beeecaf59c80e" exitCode=255 Apr 16 17:40:36.743493 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.743461 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" event={"ID":"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd","Type":"ContainerDied","Data":"00996d795eac846e0fe68dc81750ea3b21869613635d14ee386beeecaf59c80e"} Apr 16 17:40:36.743679 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.743655 2574 scope.go:117] "RemoveContainer" containerID="00996d795eac846e0fe68dc81750ea3b21869613635d14ee386beeecaf59c80e" Apr 16 17:40:36.747934 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.747913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx" event={"ID":"ec0d3d6e-865b-4664-bf7c-6b26346d0631","Type":"ContainerStarted","Data":"9ef5369433c36af478fd5d0dc278f93db99f6de112a0318d1c965420d7094331"} Apr 16 17:40:36.749920 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.749878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2kgzs" event={"ID":"b2fa5fab-8dc1-4b89-bb5a-d252dd35fa2d","Type":"ContainerStarted","Data":"dae90fca80dab76f86fb7b696880a9ebc78d825d076cafbb19bdc5e2c017ea2b"} Apr 16 17:40:36.750091 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.750073 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2kgzs" Apr 16 17:40:36.751016 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.750972 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9c6fb4dd8-k7hnt" podStartSLOduration=18.068665659 podStartE2EDuration="29.750957603s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" 
firstStartedPulling="2026-04-16 17:40:24.463426253 +0000 UTC m=+34.622796528" lastFinishedPulling="2026-04-16 17:40:36.145718208 +0000 UTC m=+46.305088472" observedRunningTime="2026-04-16 17:40:36.749349791 +0000 UTC m=+46.908720073" watchObservedRunningTime="2026-04-16 17:40:36.750957603 +0000 UTC m=+46.910327890" Apr 16 17:40:36.791618 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.791539 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6pht" podStartSLOduration=34.476781149 podStartE2EDuration="45.791520018s" podCreationTimestamp="2026-04-16 17:39:51 +0000 UTC" firstStartedPulling="2026-04-16 17:40:24.402733715 +0000 UTC m=+34.562103989" lastFinishedPulling="2026-04-16 17:40:35.71747258 +0000 UTC m=+45.876842858" observedRunningTime="2026-04-16 17:40:36.766197695 +0000 UTC m=+46.925567977" watchObservedRunningTime="2026-04-16 17:40:36.791520018 +0000 UTC m=+46.950890303" Apr 16 17:40:36.812706 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.812653 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7" podStartSLOduration=29.092033994 podStartE2EDuration="40.812633451s" podCreationTimestamp="2026-04-16 17:39:56 +0000 UTC" firstStartedPulling="2026-04-16 17:40:24.408533125 +0000 UTC m=+34.567903385" lastFinishedPulling="2026-04-16 17:40:36.129132569 +0000 UTC m=+46.288502842" observedRunningTime="2026-04-16 17:40:36.811235082 +0000 UTC m=+46.970605363" watchObservedRunningTime="2026-04-16 17:40:36.812633451 +0000 UTC m=+46.972003735" Apr 16 17:40:36.834174 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.834125 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg" podStartSLOduration=26.108669502 podStartE2EDuration="37.834105376s" 
podCreationTimestamp="2026-04-16 17:39:59 +0000 UTC" firstStartedPulling="2026-04-16 17:40:24.403109089 +0000 UTC m=+34.562479365" lastFinishedPulling="2026-04-16 17:40:36.128544969 +0000 UTC m=+46.287915239" observedRunningTime="2026-04-16 17:40:36.83238264 +0000 UTC m=+46.991752919" watchObservedRunningTime="2026-04-16 17:40:36.834105376 +0000 UTC m=+46.993475659" Apr 16 17:40:36.851412 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.851161 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-79b8f4f7c7-ps6xx" podStartSLOduration=18.149325945 podStartE2EDuration="29.851143737s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:40:24.44363008 +0000 UTC m=+34.603000345" lastFinishedPulling="2026-04-16 17:40:36.14544787 +0000 UTC m=+46.304818137" observedRunningTime="2026-04-16 17:40:36.8498893 +0000 UTC m=+47.009259578" watchObservedRunningTime="2026-04-16 17:40:36.851143737 +0000 UTC m=+47.010514020" Apr 16 17:40:36.871989 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.871473 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-dn2q5" podStartSLOduration=25.17028562 podStartE2EDuration="36.871455964s" podCreationTimestamp="2026-04-16 17:40:00 +0000 UTC" firstStartedPulling="2026-04-16 17:40:24.444286217 +0000 UTC m=+34.603656484" lastFinishedPulling="2026-04-16 17:40:36.145456564 +0000 UTC m=+46.304826828" observedRunningTime="2026-04-16 17:40:36.87072527 +0000 UTC m=+47.030095569" watchObservedRunningTime="2026-04-16 17:40:36.871455964 +0000 UTC m=+47.030826246" Apr 16 17:40:36.909242 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:36.909088 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2kgzs" podStartSLOduration=36.470944742 
podStartE2EDuration="46.909068332s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:40:25.72023128 +0000 UTC m=+35.879601545" lastFinishedPulling="2026-04-16 17:40:36.158354861 +0000 UTC m=+46.317725135" observedRunningTime="2026-04-16 17:40:36.907450287 +0000 UTC m=+47.066820568" watchObservedRunningTime="2026-04-16 17:40:36.909068332 +0000 UTC m=+47.068438615" Apr 16 17:40:37.757422 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:37.757390 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log" Apr 16 17:40:37.757997 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:37.757947 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/0.log" Apr 16 17:40:37.757997 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:37.757984 2574 generic.go:358] "Generic (PLEG): container finished" podID="05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd" containerID="7d23f14203d633bb37abe47b7a082673cf6cb6fa07f6ee5ef3c7c07060cc5f32" exitCode=255 Apr 16 17:40:37.758561 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:37.758535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" event={"ID":"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd","Type":"ContainerDied","Data":"7d23f14203d633bb37abe47b7a082673cf6cb6fa07f6ee5ef3c7c07060cc5f32"} Apr 16 17:40:37.758683 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:37.758593 2574 scope.go:117] "RemoveContainer" containerID="00996d795eac846e0fe68dc81750ea3b21869613635d14ee386beeecaf59c80e" Apr 16 17:40:37.758990 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:37.758975 2574 scope.go:117] "RemoveContainer" containerID="7d23f14203d633bb37abe47b7a082673cf6cb6fa07f6ee5ef3c7c07060cc5f32" Apr 16 17:40:37.759215 ip-10-0-138-45 
kubenswrapper[2574]: E0416 17:40:37.759193 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zcvg5_openshift-console-operator(05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" podUID="05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd" Apr 16 17:40:38.765555 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:38.765528 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log" Apr 16 17:40:38.766150 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:38.766126 2574 scope.go:117] "RemoveContainer" containerID="7d23f14203d633bb37abe47b7a082673cf6cb6fa07f6ee5ef3c7c07060cc5f32" Apr 16 17:40:38.766349 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:38.766325 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zcvg5_openshift-console-operator(05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" podUID="05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd" Apr 16 17:40:39.671318 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:39.670716 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:40:39.671318 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:39.670879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955" Apr 16 17:40:39.671318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.670883 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:39.671318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.670926 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f8454dc74-5t5l5: secret "image-registry-tls" not found Apr 16 17:40:39.671318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.670938 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 17:40:39.671318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.670989 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls podName:7d3e502c-98bf-4943-aa5a-2412d9b53369 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:55.670968414 +0000 UTC m=+65.830338698 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls") pod "image-registry-7f8454dc74-5t5l5" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369") : secret "image-registry-tls" not found Apr 16 17:40:39.671318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.671009 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls podName:3bb520c8-5109-453e-8cb0-2d1c298b05a9 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:55.670996434 +0000 UTC m=+65.830366694 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls") pod "cluster-samples-operator-667775844f-gm955" (UID: "3bb520c8-5109-453e-8cb0-2d1c298b05a9") : secret "samples-operator-tls" not found Apr 16 17:40:39.771894 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:39.771859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n" Apr 16 17:40:39.772318 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:39.771970 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl" Apr 16 17:40:39.772318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.772008 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:39.772318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.772067 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:39.772318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.772086 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls podName:d237b125-8598-4bde-9a2c-f3145e257d0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:55.772065842 +0000 UTC m=+65.931436119 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls") pod "dns-default-bs84n" (UID: "d237b125-8598-4bde-9a2c-f3145e257d0a") : secret "dns-default-metrics-tls" not found Apr 16 17:40:39.772318 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:39.772157 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert podName:c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c nodeName:}" failed. No retries permitted until 2026-04-16 17:40:55.772144854 +0000 UTC m=+65.931515114 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert") pod "ingress-canary-jv4sl" (UID: "c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c") : secret "canary-serving-cert" not found Apr 16 17:40:41.775767 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:41.775728 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-49fnc" event={"ID":"cf7246f1-b28a-4d7e-bda9-3608f1c76516","Type":"ContainerStarted","Data":"10acaeeb4a80befa088973abfb9789e5970859df09b051c9ae52e7c702afa871"} Apr 16 17:40:41.777381 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:41.777355 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" event={"ID":"0583cad7-fc5e-4718-a618-7be058c6fa40","Type":"ContainerStarted","Data":"58b4345e2d074c4673d863b3ae0ca510af0cdd06caf607753277e97875904fb7"} Apr 16 17:40:41.777504 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:41.777386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" event={"ID":"0583cad7-fc5e-4718-a618-7be058c6fa40","Type":"ContainerStarted","Data":"7c3fef33f13093334dd7228bc3f55b5e08d2f33b4dc0a9ebc435e08d9d5dcf46"} Apr 16 17:40:41.792118 
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:41.792076 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-49fnc" podStartSLOduration=34.854034926 podStartE2EDuration="39.792062474s" podCreationTimestamp="2026-04-16 17:40:02 +0000 UTC" firstStartedPulling="2026-04-16 17:40:36.320726383 +0000 UTC m=+46.480096642" lastFinishedPulling="2026-04-16 17:40:41.258753923 +0000 UTC m=+51.418124190" observedRunningTime="2026-04-16 17:40:41.791020873 +0000 UTC m=+51.950391154" watchObservedRunningTime="2026-04-16 17:40:41.792062474 +0000 UTC m=+51.951432779" Apr 16 17:40:41.811293 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:41.811233 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" podStartSLOduration=18.407036833 podStartE2EDuration="34.811218743s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:40:24.488193837 +0000 UTC m=+34.647564097" lastFinishedPulling="2026-04-16 17:40:40.892375747 +0000 UTC m=+51.051746007" observedRunningTime="2026-04-16 17:40:41.809592621 +0000 UTC m=+51.968962894" watchObservedRunningTime="2026-04-16 17:40:41.811218743 +0000 UTC m=+51.970589064" Apr 16 17:40:43.502702 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:43.502654 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg" Apr 16 17:40:43.503176 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:43.502800 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" 
not found Apr 16 17:40:43.503176 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:43.502870 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert podName:c8d26167-1916-46e8-8dba-80fb8694cf64 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:59.502853253 +0000 UTC m=+69.662223513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-bbrtg" (UID: "c8d26167-1916-46e8-8dba-80fb8694cf64") : secret "networking-console-plugin-cert" not found Apr 16 17:40:44.109518 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:44.109463 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" Apr 16 17:40:44.109518 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:44.109520 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" Apr 16 17:40:44.109968 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:44.109951 2574 scope.go:117] "RemoveContainer" containerID="7d23f14203d633bb37abe47b7a082673cf6cb6fa07f6ee5ef3c7c07060cc5f32" Apr 16 17:40:44.110161 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:40:44.110139 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-zcvg5_openshift-console-operator(05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" podUID="05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd" Apr 16 17:40:49.591925 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:49.591891 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42vwn"
Apr 16 17:40:55.450734 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.450702 2574 scope.go:117] "RemoveContainer" containerID="7d23f14203d633bb37abe47b7a082673cf6cb6fa07f6ee5ef3c7c07060cc5f32"
Apr 16 17:40:55.711543 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.707386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:55.711543 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.707465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:55.711543 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.710783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"image-registry-7f8454dc74-5t5l5\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:55.712202 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.712180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bb520c8-5109-453e-8cb0-2d1c298b05a9-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-gm955\" (UID: \"3bb520c8-5109-453e-8cb0-2d1c298b05a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:55.808551 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.808509 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:55.808751 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.808613 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:55.811176 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.811143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d237b125-8598-4bde-9a2c-f3145e257d0a-metrics-tls\") pod \"dns-default-bs84n\" (UID: \"d237b125-8598-4bde-9a2c-f3145e257d0a\") " pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:55.811299 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.811232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c-cert\") pod \"ingress-canary-jv4sl\" (UID: \"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c\") " pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:55.823044 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.823023 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 17:40:55.823153 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.823105 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" event={"ID":"05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd","Type":"ContainerStarted","Data":"e8479172bb927caf53b74508d120b4a59b08ff90248bda66cd842d8676fd089a"}
Apr 16 17:40:55.823431 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.823404 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:55.841590 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.841533 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5" podStartSLOduration=52.10426859 podStartE2EDuration="1m3.841520735s" podCreationTimestamp="2026-04-16 17:39:52 +0000 UTC" firstStartedPulling="2026-04-16 17:40:24.406984512 +0000 UTC m=+34.566354786" lastFinishedPulling="2026-04-16 17:40:36.144236657 +0000 UTC m=+46.303606931" observedRunningTime="2026-04-16 17:40:55.840776833 +0000 UTC m=+66.000147116" watchObservedRunningTime="2026-04-16 17:40:55.841520735 +0000 UTC m=+66.000891017"
Apr 16 17:40:55.879014 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.878985 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n5znr\""
Apr 16 17:40:55.886866 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.886844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:55.894991 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.894968 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-r9plp\""
Apr 16 17:40:55.903589 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.903541 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"
Apr 16 17:40:55.995522 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:55.995379 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zdjnc\""
Apr 16 17:40:56.003605 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.003563 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jv4sl"
Apr 16 17:40:56.004412 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.004272 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qhzh2\""
Apr 16 17:40:56.007052 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.006985 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bs84n"
Apr 16 17:40:56.029262 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.029179 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f8454dc74-5t5l5"]
Apr 16 17:40:56.034933 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:56.034758 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d3e502c_98bf_4943_aa5a_2412d9b53369.slice/crio-3319e4cd0884cc384d29f264e65f9dbc66b06576b609ea8600ec7a6d99a46832 WatchSource:0}: Error finding container 3319e4cd0884cc384d29f264e65f9dbc66b06576b609ea8600ec7a6d99a46832: Status 404 returned error can't find the container with id 3319e4cd0884cc384d29f264e65f9dbc66b06576b609ea8600ec7a6d99a46832
Apr 16 17:40:56.043509 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.043486 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955"]
Apr 16 17:40:56.190980 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.190944 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bs84n"]
Apr 16 17:40:56.194801 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:56.194772 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd237b125_8598_4bde_9a2c_f3145e257d0a.slice/crio-f1a826c1298a7abeb40f5a0b0185a7f7e0ae78c72cb819381709ca380a24566d WatchSource:0}: Error finding container f1a826c1298a7abeb40f5a0b0185a7f7e0ae78c72cb819381709ca380a24566d: Status 404 returned error can't find the container with id f1a826c1298a7abeb40f5a0b0185a7f7e0ae78c72cb819381709ca380a24566d
Apr 16 17:40:56.199989 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.199951 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jv4sl"]
Apr 16 17:40:56.203365 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:56.203341 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1e50cfa_f2ad_4fdb_bf97_8bab7c30928c.slice/crio-28832357cee3ab771a9a1b2eb5363b62e47bac3853c293884bb3e39e14b297d7 WatchSource:0}: Error finding container 28832357cee3ab771a9a1b2eb5363b62e47bac3853c293884bb3e39e14b297d7: Status 404 returned error can't find the container with id 28832357cee3ab771a9a1b2eb5363b62e47bac3853c293884bb3e39e14b297d7
Apr 16 17:40:56.313603 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.313488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:56.315783 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.315758 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 17:40:56.326929 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.326898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88ac585a-175b-45f2-8696-1c754b2a5a05-metrics-certs\") pod \"network-metrics-daemon-cxxrm\" (UID: \"88ac585a-175b-45f2-8696-1c754b2a5a05\") " pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:56.393424 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.393389 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wdxwt\""
Apr 16 17:40:56.401683 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.401657 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cxxrm"
Apr 16 17:40:56.542596 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.542538 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cxxrm"]
Apr 16 17:40:56.554618 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:40:56.554560 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ac585a_175b_45f2_8696_1c754b2a5a05.slice/crio-0995b0173b728875d092182f5753ac9fb1f11a6272f4b5d3125af19e70869281 WatchSource:0}: Error finding container 0995b0173b728875d092182f5753ac9fb1f11a6272f4b5d3125af19e70869281: Status 404 returned error can't find the container with id 0995b0173b728875d092182f5753ac9fb1f11a6272f4b5d3125af19e70869281
Apr 16 17:40:56.823822 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.823793 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-zcvg5"
Apr 16 17:40:56.829080 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.829035 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jv4sl" event={"ID":"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c","Type":"ContainerStarted","Data":"28832357cee3ab771a9a1b2eb5363b62e47bac3853c293884bb3e39e14b297d7"}
Apr 16 17:40:56.830462 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.830428 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxxrm" event={"ID":"88ac585a-175b-45f2-8696-1c754b2a5a05","Type":"ContainerStarted","Data":"0995b0173b728875d092182f5753ac9fb1f11a6272f4b5d3125af19e70869281"}
Apr 16 17:40:56.832336 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.832306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" event={"ID":"7d3e502c-98bf-4943-aa5a-2412d9b53369","Type":"ContainerStarted","Data":"5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a"}
Apr 16 17:40:56.832461 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.832342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" event={"ID":"7d3e502c-98bf-4943-aa5a-2412d9b53369","Type":"ContainerStarted","Data":"3319e4cd0884cc384d29f264e65f9dbc66b06576b609ea8600ec7a6d99a46832"}
Apr 16 17:40:56.832517 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.832491 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:40:56.835616 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.835549 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bs84n" event={"ID":"d237b125-8598-4bde-9a2c-f3145e257d0a","Type":"ContainerStarted","Data":"f1a826c1298a7abeb40f5a0b0185a7f7e0ae78c72cb819381709ca380a24566d"}
Apr 16 17:40:56.837425 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:56.837387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955" event={"ID":"3bb520c8-5109-453e-8cb0-2d1c298b05a9","Type":"ContainerStarted","Data":"b55f5b0a68645c3b65a7a290033e5bee2d95da9b1ceae13d849a9aa7b98b8b6b"}
Apr 16 17:40:59.546590 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:59.546523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:59.561714 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:59.561680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8d26167-1916-46e8-8dba-80fb8694cf64-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-bbrtg\" (UID: \"c8d26167-1916-46e8-8dba-80fb8694cf64\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:40:59.733945 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:59.733908 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bv2x4\""
Apr 16 17:40:59.737653 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:40:59.737635 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"
Apr 16 17:41:00.031154 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.031100 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" podStartSLOduration=69.031064651 podStartE2EDuration="1m9.031064651s" podCreationTimestamp="2026-04-16 17:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:56.876564527 +0000 UTC m=+67.035934815" watchObservedRunningTime="2026-04-16 17:41:00.031064651 +0000 UTC m=+70.190434935"
Apr 16 17:41:00.032176 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.032152 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg"]
Apr 16 17:41:00.855236 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.855196 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bs84n" event={"ID":"d237b125-8598-4bde-9a2c-f3145e257d0a","Type":"ContainerStarted","Data":"2f3fa3f9edf31b61778a6d647ef3f2d43841b69c00eff09e5dd6d230b14d1aae"}
Apr 16 17:41:00.855236 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.855231 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bs84n" event={"ID":"d237b125-8598-4bde-9a2c-f3145e257d0a","Type":"ContainerStarted","Data":"db5366248963e661af36e668b4201037f5231298e6bfac28116122d2ee2858f7"}
Apr 16 17:41:00.856460 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.855373 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bs84n"
Apr 16 17:41:00.856885 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.856863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955" event={"ID":"3bb520c8-5109-453e-8cb0-2d1c298b05a9","Type":"ContainerStarted","Data":"ce6f1b1827db8f4b0ce02ab5a26428d02097ba8e10e174fec2c7d93d5e639917"}
Apr 16 17:41:00.856980 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.856893 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955" event={"ID":"3bb520c8-5109-453e-8cb0-2d1c298b05a9","Type":"ContainerStarted","Data":"9d45b9d2722e81a0154395b872de38ce5a934b043af45185900f5ea1b763ed97"}
Apr 16 17:41:00.857921 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.857902 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg" event={"ID":"c8d26167-1916-46e8-8dba-80fb8694cf64","Type":"ContainerStarted","Data":"71f9a6ab02626f5b733693228db143233f8710c1ceb5e80f5f3ca5e2e45b10d5"}
Apr 16 17:41:00.859085 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.859066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jv4sl" event={"ID":"c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c","Type":"ContainerStarted","Data":"f130679bf038a5ffaca808590233ed47ed4a0aacfe31b67ee406bff8726587e2"}
Apr 16 17:41:00.860554 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.860536 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxxrm" event={"ID":"88ac585a-175b-45f2-8696-1c754b2a5a05","Type":"ContainerStarted","Data":"cab51b334011292267e70ec322620b77c9481ea30e18ed6a979faff907511fb8"}
Apr 16 17:41:00.860653 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.860558 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cxxrm" event={"ID":"88ac585a-175b-45f2-8696-1c754b2a5a05","Type":"ContainerStarted","Data":"8d6464ddbdeaf53bf1179a727a476b90396c383871787019dbe192ce90302843"}
Apr 16 17:41:00.873880 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.873841 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bs84n" podStartSLOduration=34.184016595 podStartE2EDuration="37.873829759s" podCreationTimestamp="2026-04-16 17:40:23 +0000 UTC" firstStartedPulling="2026-04-16 17:40:56.199976758 +0000 UTC m=+66.359347024" lastFinishedPulling="2026-04-16 17:40:59.889789928 +0000 UTC m=+70.049160188" observedRunningTime="2026-04-16 17:41:00.873131031 +0000 UTC m=+71.032501325" watchObservedRunningTime="2026-04-16 17:41:00.873829759 +0000 UTC m=+71.033200024"
Apr 16 17:41:00.890878 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.890832 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cxxrm" podStartSLOduration=67.558816026 podStartE2EDuration="1m10.890817852s" podCreationTimestamp="2026-04-16 17:39:50 +0000 UTC" firstStartedPulling="2026-04-16 17:40:56.557269312 +0000 UTC m=+66.716639585" lastFinishedPulling="2026-04-16 17:40:59.889271134 +0000 UTC m=+70.048641411" observedRunningTime="2026-04-16 17:41:00.89011831 +0000 UTC m=+71.049488593" watchObservedRunningTime="2026-04-16 17:41:00.890817852 +0000 UTC m=+71.050188138"
Apr 16 17:41:00.907205 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.907159 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-gm955" podStartSLOduration=66.159638277 podStartE2EDuration="1m9.90713985s" podCreationTimestamp="2026-04-16 17:39:51 +0000 UTC" firstStartedPulling="2026-04-16 17:40:56.141325733 +0000 UTC m=+66.300696007" lastFinishedPulling="2026-04-16 17:40:59.888827305 +0000 UTC m=+70.048197580" observedRunningTime="2026-04-16 17:41:00.90634704 +0000 UTC m=+71.065717325" watchObservedRunningTime="2026-04-16 17:41:00.90713985 +0000 UTC m=+71.066510133"
Apr 16 17:41:00.922328 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:00.922282 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jv4sl" podStartSLOduration=34.237986093 podStartE2EDuration="37.922266229s" podCreationTimestamp="2026-04-16 17:40:23 +0000 UTC" firstStartedPulling="2026-04-16 17:40:56.205141813 +0000 UTC m=+66.364512074" lastFinishedPulling="2026-04-16 17:40:59.88942195 +0000 UTC m=+70.048792210" observedRunningTime="2026-04-16 17:41:00.921510083 +0000 UTC m=+71.080880391" watchObservedRunningTime="2026-04-16 17:41:00.922266229 +0000 UTC m=+71.081636557"
Apr 16 17:41:01.864817 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:01.864715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg" event={"ID":"c8d26167-1916-46e8-8dba-80fb8694cf64","Type":"ContainerStarted","Data":"24b341a05d8e1bc16cea7ac6f768103c6de1954228fdabac6ccf9aae6a2d4fa1"}
Apr 16 17:41:01.881771 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:01.881725 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-bbrtg" podStartSLOduration=33.329710671 podStartE2EDuration="34.881711821s" podCreationTimestamp="2026-04-16 17:40:27 +0000 UTC" firstStartedPulling="2026-04-16 17:41:00.042388255 +0000 UTC m=+70.201758529" lastFinishedPulling="2026-04-16 17:41:01.594389405 +0000 UTC m=+71.753759679" observedRunningTime="2026-04-16 17:41:01.880262336 +0000 UTC m=+72.039632617" watchObservedRunningTime="2026-04-16 17:41:01.881711821 +0000 UTC m=+72.041082173"
Apr 16 17:41:07.762291 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:07.762260 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2kgzs"
Apr 16 17:41:10.867073 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:10.867040 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bs84n"
Apr 16 17:41:13.409965 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:13.409934 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bs84n_d237b125-8598-4bde-9a2c-f3145e257d0a/dns/0.log"
Apr 16 17:41:13.594159 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:13.594133 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bs84n_d237b125-8598-4bde-9a2c-f3145e257d0a/kube-rbac-proxy/0.log"
Apr 16 17:41:14.194052 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:14.194026 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ncccq_cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25/dns-node-resolver/0.log"
Apr 16 17:41:14.994588 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:14.994546 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7f8454dc74-5t5l5_7d3e502c-98bf-4943-aa5a-2412d9b53369/registry/0.log"
Apr 16 17:41:15.594140 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:15.594118 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gbhm8_810acbe7-5369-4316-8f52-ddfcdd34e838/node-ca/0.log"
Apr 16 17:41:16.193704 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:16.193663 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jv4sl_c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c/serve-healthcheck-canary/0.log"
Apr 16 17:41:17.820753 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.820720 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bc7hk"]
Apr 16 17:41:17.825856 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.825832 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.827957 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.827935 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 17:41:17.828069 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.828017 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 17:41:17.828424 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.828384 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 17:41:17.828539 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.828469 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wqdvb\""
Apr 16 17:41:17.828789 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.828768 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 17:41:17.828975 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.828957 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 17:41:17.829112 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.829094 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 17:41:17.844560 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.844540 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5"
Apr 16 17:41:17.991141 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghjh\" (UniqueName: \"kubernetes.io/projected/d187c518-52cd-400f-b7eb-ee4502e56915-kube-api-access-9ghjh\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.991314 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-accelerators-collector-config\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.991400 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-textfile\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.991561 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991533 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-wtmp\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.991678 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991661 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-sys\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.991715 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-tls\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.991758 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.991758 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-root\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:17.991835 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:17.991763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d187c518-52cd-400f-b7eb-ee4502e56915-metrics-client-ca\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.092458 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-textfile\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.092458 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-wtmp\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.092725 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-wtmp\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.092725 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-sys\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.092725 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-tls\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.092725 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-sys\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.092725 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.092725 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-root\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.093056 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092737 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-textfile\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.093056 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d187c518-52cd-400f-b7eb-ee4502e56915-metrics-client-ca\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.093056 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092808 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghjh\" (UniqueName: \"kubernetes.io/projected/d187c518-52cd-400f-b7eb-ee4502e56915-kube-api-access-9ghjh\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.093056 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-accelerators-collector-config\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.093056 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.092879 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d187c518-52cd-400f-b7eb-ee4502e56915-root\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.093360 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.093340 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d187c518-52cd-400f-b7eb-ee4502e56915-metrics-client-ca\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.093360 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.093346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-accelerators-collector-config\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.095251 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.095229 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.095371 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.095355 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d187c518-52cd-400f-b7eb-ee4502e56915-node-exporter-tls\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.113147 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.113116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghjh\" (UniqueName: \"kubernetes.io/projected/d187c518-52cd-400f-b7eb-ee4502e56915-kube-api-access-9ghjh\") pod \"node-exporter-bc7hk\" (UID: \"d187c518-52cd-400f-b7eb-ee4502e56915\") " pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.135244 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.135220 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bc7hk"
Apr 16 17:41:18.144280 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:41:18.144252 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd187c518_52cd_400f_b7eb_ee4502e56915.slice/crio-255677b2f640850f9937d9eb8ac9513fd36b54318dac84a8ce8daa6a590d7992 WatchSource:0}: Error finding container 255677b2f640850f9937d9eb8ac9513fd36b54318dac84a8ce8daa6a590d7992: Status 404 returned error can't find the container with id 255677b2f640850f9937d9eb8ac9513fd36b54318dac84a8ce8daa6a590d7992
Apr 16 17:41:18.912530 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:18.912501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bc7hk" event={"ID":"d187c518-52cd-400f-b7eb-ee4502e56915","Type":"ContainerStarted","Data":"255677b2f640850f9937d9eb8ac9513fd36b54318dac84a8ce8daa6a590d7992"}
Apr 16 17:41:19.919078 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:19.919035 2574 generic.go:358] "Generic (PLEG): container finished" podID="d187c518-52cd-400f-b7eb-ee4502e56915" containerID="49021476c374c286b7a9e96d0e21b564d6873feda320d30160287525b75d336f" exitCode=0
Apr 16 17:41:19.919518 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:19.919108 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bc7hk" event={"ID":"d187c518-52cd-400f-b7eb-ee4502e56915","Type":"ContainerDied","Data":"49021476c374c286b7a9e96d0e21b564d6873feda320d30160287525b75d336f"}
Apr 16 17:41:20.923745 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:20.923710 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bc7hk" event={"ID":"d187c518-52cd-400f-b7eb-ee4502e56915","Type":"ContainerStarted","Data":"b21fb2b3603d0715793a559486b7bcd07ce31badf40bdb0be592b1b948d03d73"}
Apr 16 17:41:20.923745 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:20.923748 2574
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bc7hk" event={"ID":"d187c518-52cd-400f-b7eb-ee4502e56915","Type":"ContainerStarted","Data":"a58fd0f61f3c4ac2b4bd098e72125abc771025beeaf92235e43ebe40a6498a76"} Apr 16 17:41:20.945288 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:20.945234 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bc7hk" podStartSLOduration=3.221164832 podStartE2EDuration="3.945219058s" podCreationTimestamp="2026-04-16 17:41:17 +0000 UTC" firstStartedPulling="2026-04-16 17:41:18.146110101 +0000 UTC m=+88.305480360" lastFinishedPulling="2026-04-16 17:41:18.870164324 +0000 UTC m=+89.029534586" observedRunningTime="2026-04-16 17:41:20.943722544 +0000 UTC m=+91.103092839" watchObservedRunningTime="2026-04-16 17:41:20.945219058 +0000 UTC m=+91.104589336" Apr 16 17:41:25.955776 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:25.955745 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f8454dc74-5t5l5"] Apr 16 17:41:27.119390 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.119351 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-m4q6m"] Apr 16 17:41:27.123848 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.123827 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.126128 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.126108 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 17:41:27.126128 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.126113 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 17:41:27.126376 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.126360 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 17:41:27.126890 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.126868 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 17:41:27.127034 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.127018 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-kg8l7\"" Apr 16 17:41:27.131673 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.131654 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m4q6m"] Apr 16 17:41:27.262205 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.262173 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11da63ca-3756-4ddf-8338-7b87d9600243-data-volume\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.262205 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.262209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/11da63ca-3756-4ddf-8338-7b87d9600243-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.262420 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.262247 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11da63ca-3756-4ddf-8338-7b87d9600243-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.262420 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.262307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/11da63ca-3756-4ddf-8338-7b87d9600243-crio-socket\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.262420 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.262398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94j5q\" (UniqueName: \"kubernetes.io/projected/11da63ca-3756-4ddf-8338-7b87d9600243-kube-api-access-94j5q\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.363874 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.363784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94j5q\" (UniqueName: \"kubernetes.io/projected/11da63ca-3756-4ddf-8338-7b87d9600243-kube-api-access-94j5q\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " 
pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.363874 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.363853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11da63ca-3756-4ddf-8338-7b87d9600243-data-volume\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.363874 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.363878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/11da63ca-3756-4ddf-8338-7b87d9600243-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.364090 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.363922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11da63ca-3756-4ddf-8338-7b87d9600243-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.364090 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.363954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/11da63ca-3756-4ddf-8338-7b87d9600243-crio-socket\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.364090 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:41:27.364044 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not 
found Apr 16 17:41:27.364192 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:41:27.364125 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11da63ca-3756-4ddf-8338-7b87d9600243-insights-runtime-extractor-tls podName:11da63ca-3756-4ddf-8338-7b87d9600243 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:27.864102737 +0000 UTC m=+98.023473014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/11da63ca-3756-4ddf-8338-7b87d9600243-insights-runtime-extractor-tls") pod "insights-runtime-extractor-m4q6m" (UID: "11da63ca-3756-4ddf-8338-7b87d9600243") : secret "insights-runtime-extractor-tls" not found Apr 16 17:41:27.364192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.364049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/11da63ca-3756-4ddf-8338-7b87d9600243-crio-socket\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.364192 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.364183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/11da63ca-3756-4ddf-8338-7b87d9600243-data-volume\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.364498 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.364481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/11da63ca-3756-4ddf-8338-7b87d9600243-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 
17:41:27.373226 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.373206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94j5q\" (UniqueName: \"kubernetes.io/projected/11da63ca-3756-4ddf-8338-7b87d9600243-kube-api-access-94j5q\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.868400 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.868363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11da63ca-3756-4ddf-8338-7b87d9600243-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:27.870907 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:27.870882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/11da63ca-3756-4ddf-8338-7b87d9600243-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4q6m\" (UID: \"11da63ca-3756-4ddf-8338-7b87d9600243\") " pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:28.034067 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:28.034036 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m4q6m" Apr 16 17:41:28.161828 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:28.161798 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m4q6m"] Apr 16 17:41:28.163863 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:41:28.163831 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11da63ca_3756_4ddf_8338_7b87d9600243.slice/crio-29166fe5119219338f4246979ef9b3eb2acc08884d09ee9f18796a1620b327e8 WatchSource:0}: Error finding container 29166fe5119219338f4246979ef9b3eb2acc08884d09ee9f18796a1620b327e8: Status 404 returned error can't find the container with id 29166fe5119219338f4246979ef9b3eb2acc08884d09ee9f18796a1620b327e8 Apr 16 17:41:28.956855 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:28.956822 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4q6m" event={"ID":"11da63ca-3756-4ddf-8338-7b87d9600243","Type":"ContainerStarted","Data":"0341dda93ef3f65888396204755d9de6d37fff4dcf4f390ff00f091a7a3b66fc"} Apr 16 17:41:28.956855 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:28.956858 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4q6m" event={"ID":"11da63ca-3756-4ddf-8338-7b87d9600243","Type":"ContainerStarted","Data":"9c9456b89e5d12f9f40c36554ff8d5afac01a648de4b172e1a8f5002f4f42b36"} Apr 16 17:41:28.957055 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:28.956868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4q6m" event={"ID":"11da63ca-3756-4ddf-8338-7b87d9600243","Type":"ContainerStarted","Data":"29166fe5119219338f4246979ef9b3eb2acc08884d09ee9f18796a1620b327e8"} Apr 16 17:41:30.963901 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:30.963866 2574 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-m4q6m" event={"ID":"11da63ca-3756-4ddf-8338-7b87d9600243","Type":"ContainerStarted","Data":"588979f10c9589d96779e71168f6eafa6291d21902ca2f565c89da14b8240bad"} Apr 16 17:41:30.983002 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:30.982948 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-m4q6m" podStartSLOduration=2.066954284 podStartE2EDuration="3.982931896s" podCreationTimestamp="2026-04-16 17:41:27 +0000 UTC" firstStartedPulling="2026-04-16 17:41:28.219092509 +0000 UTC m=+98.378462770" lastFinishedPulling="2026-04-16 17:41:30.135070102 +0000 UTC m=+100.294440382" observedRunningTime="2026-04-16 17:41:30.982802115 +0000 UTC m=+101.142172421" watchObservedRunningTime="2026-04-16 17:41:30.982931896 +0000 UTC m=+101.142302375" Apr 16 17:41:50.975713 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:50.975648 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" podUID="7d3e502c-98bf-4943-aa5a-2412d9b53369" containerName="registry" containerID="cri-o://5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a" gracePeriod=30 Apr 16 17:41:51.210800 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.210776 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:41:51.244831 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.244751 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d3e502c-98bf-4943-aa5a-2412d9b53369-ca-trust-extracted\") pod \"7d3e502c-98bf-4943-aa5a-2412d9b53369\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " Apr 16 17:41:51.244831 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.244809 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-image-registry-private-configuration\") pod \"7d3e502c-98bf-4943-aa5a-2412d9b53369\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " Apr 16 17:41:51.244831 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.244829 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-trusted-ca\") pod \"7d3e502c-98bf-4943-aa5a-2412d9b53369\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " Apr 16 17:41:51.245092 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.244846 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") pod \"7d3e502c-98bf-4943-aa5a-2412d9b53369\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " Apr 16 17:41:51.245092 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.244894 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nldfw\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-kube-api-access-nldfw\") pod \"7d3e502c-98bf-4943-aa5a-2412d9b53369\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " 
Apr 16 17:41:51.245092 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.244930 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-installation-pull-secrets\") pod \"7d3e502c-98bf-4943-aa5a-2412d9b53369\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " Apr 16 17:41:51.245092 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.244956 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-bound-sa-token\") pod \"7d3e502c-98bf-4943-aa5a-2412d9b53369\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " Apr 16 17:41:51.245092 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.244983 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-certificates\") pod \"7d3e502c-98bf-4943-aa5a-2412d9b53369\" (UID: \"7d3e502c-98bf-4943-aa5a-2412d9b53369\") " Apr 16 17:41:51.245394 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.245364 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7d3e502c-98bf-4943-aa5a-2412d9b53369" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:41:51.245504 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.245473 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7d3e502c-98bf-4943-aa5a-2412d9b53369" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:41:51.247909 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.247879 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7d3e502c-98bf-4943-aa5a-2412d9b53369" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:41:51.248095 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.248043 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7d3e502c-98bf-4943-aa5a-2412d9b53369" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:41:51.248199 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.248139 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7d3e502c-98bf-4943-aa5a-2412d9b53369" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:41:51.248301 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.248273 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-kube-api-access-nldfw" (OuterVolumeSpecName: "kube-api-access-nldfw") pod "7d3e502c-98bf-4943-aa5a-2412d9b53369" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369"). InnerVolumeSpecName "kube-api-access-nldfw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:41:51.248343 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.248309 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7d3e502c-98bf-4943-aa5a-2412d9b53369" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:41:51.254094 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.254069 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3e502c-98bf-4943-aa5a-2412d9b53369-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7d3e502c-98bf-4943-aa5a-2412d9b53369" (UID: "7d3e502c-98bf-4943-aa5a-2412d9b53369"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:41:51.345801 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.345753 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-image-registry-private-configuration\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:41:51.345801 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.345799 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-trusted-ca\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:41:51.345801 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.345809 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-tls\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:41:51.345801 ip-10-0-138-45 
kubenswrapper[2574]: I0416 17:41:51.345819 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nldfw\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-kube-api-access-nldfw\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:41:51.346062 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.345829 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d3e502c-98bf-4943-aa5a-2412d9b53369-installation-pull-secrets\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:41:51.346062 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.345838 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d3e502c-98bf-4943-aa5a-2412d9b53369-bound-sa-token\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:41:51.346062 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.345848 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d3e502c-98bf-4943-aa5a-2412d9b53369-registry-certificates\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:41:51.346062 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:51.345856 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d3e502c-98bf-4943-aa5a-2412d9b53369-ca-trust-extracted\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:41:52.021126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.021089 2574 generic.go:358] "Generic (PLEG): container finished" podID="7d3e502c-98bf-4943-aa5a-2412d9b53369" containerID="5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a" exitCode=0 Apr 16 17:41:52.021559 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.021172 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" Apr 16 17:41:52.021559 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.021171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" event={"ID":"7d3e502c-98bf-4943-aa5a-2412d9b53369","Type":"ContainerDied","Data":"5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a"} Apr 16 17:41:52.021559 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.021217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f8454dc74-5t5l5" event={"ID":"7d3e502c-98bf-4943-aa5a-2412d9b53369","Type":"ContainerDied","Data":"3319e4cd0884cc384d29f264e65f9dbc66b06576b609ea8600ec7a6d99a46832"} Apr 16 17:41:52.021559 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.021234 2574 scope.go:117] "RemoveContainer" containerID="5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a" Apr 16 17:41:52.031026 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.031010 2574 scope.go:117] "RemoveContainer" containerID="5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a" Apr 16 17:41:52.031301 ip-10-0-138-45 kubenswrapper[2574]: E0416 17:41:52.031280 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a\": container with ID starting with 5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a not found: ID does not exist" containerID="5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a" Apr 16 17:41:52.031351 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.031311 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a"} err="failed to get container status 
\"5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a\": rpc error: code = NotFound desc = could not find container \"5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a\": container with ID starting with 5c80e3e4809676491289eb2502a1562ae3f5299f91ffb28eb93d6dedf726b68a not found: ID does not exist"
Apr 16 17:41:52.045845 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.045822 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f8454dc74-5t5l5"]
Apr 16 17:41:52.052565 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.052545 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7f8454dc74-5t5l5"]
Apr 16 17:41:52.454763 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:52.454731 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3e502c-98bf-4943-aa5a-2412d9b53369" path="/var/lib/kubelet/pods/7d3e502c-98bf-4943-aa5a-2412d9b53369/volumes"
Apr 16 17:41:53.025891 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:53.025859 2574 generic.go:358] "Generic (PLEG): container finished" podID="a957470a-21dc-4720-b171-61e476f5ec68" containerID="a9a19f4f24c0a4d321994be228a3103f3eecc886c7de8e3bdc96639f9e54abe3" exitCode=0
Apr 16 17:41:53.026363 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:53.025903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg" event={"ID":"a957470a-21dc-4720-b171-61e476f5ec68","Type":"ContainerDied","Data":"a9a19f4f24c0a4d321994be228a3103f3eecc886c7de8e3bdc96639f9e54abe3"}
Apr 16 17:41:53.026363 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:53.026297 2574 scope.go:117] "RemoveContainer" containerID="a9a19f4f24c0a4d321994be228a3103f3eecc886c7de8e3bdc96639f9e54abe3"
Apr 16 17:41:54.030504 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:54.030470 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7fwlg" event={"ID":"a957470a-21dc-4720-b171-61e476f5ec68","Type":"ContainerStarted","Data":"e60822fa47826b8056d0a63f127dac48e39fb8eef90fe3ab6cb74d730079a643"}
Apr 16 17:41:57.038746 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:57.038712 2574 generic.go:358] "Generic (PLEG): container finished" podID="7df6448d-96ef-4887-bf9c-4b4dfb72c238" containerID="e6107f2c7f154063e69356a4e0a729c43ce94bfa7d54bc67df288f8b8e47d476" exitCode=0
Apr 16 17:41:57.039193 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:57.038787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7" event={"ID":"7df6448d-96ef-4887-bf9c-4b4dfb72c238","Type":"ContainerDied","Data":"e6107f2c7f154063e69356a4e0a729c43ce94bfa7d54bc67df288f8b8e47d476"}
Apr 16 17:41:57.039193 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:57.039125 2574 scope.go:117] "RemoveContainer" containerID="e6107f2c7f154063e69356a4e0a729c43ce94bfa7d54bc67df288f8b8e47d476"
Apr 16 17:41:58.043228 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:41:58.043192 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-qdmf7" event={"ID":"7df6448d-96ef-4887-bf9c-4b4dfb72c238","Type":"ContainerStarted","Data":"1db3c05c3c637949f2ed038c72baaa3c91715b82f88e475910be21a00b9ba071"}
Apr 16 17:42:14.175442 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:14.175376 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" podUID="0583cad7-fc5e-4718-a618-7be058c6fa40" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 17:42:24.175604 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:24.175542 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" podUID="0583cad7-fc5e-4718-a618-7be058c6fa40" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 17:42:34.175709 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:34.175667 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" podUID="0583cad7-fc5e-4718-a618-7be058c6fa40" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 17:42:34.176149 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:34.175740 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8"
Apr 16 17:42:34.176213 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:34.176197 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"58b4345e2d074c4673d863b3ae0ca510af0cdd06caf607753277e97875904fb7"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 17:42:34.176252 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:34.176235 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" podUID="0583cad7-fc5e-4718-a618-7be058c6fa40" containerName="service-proxy" containerID="cri-o://58b4345e2d074c4673d863b3ae0ca510af0cdd06caf607753277e97875904fb7" gracePeriod=30
Apr 16 17:42:35.143841 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:35.143810 2574 generic.go:358] "Generic (PLEG): container finished" podID="0583cad7-fc5e-4718-a618-7be058c6fa40" containerID="58b4345e2d074c4673d863b3ae0ca510af0cdd06caf607753277e97875904fb7" exitCode=2
Apr 16 17:42:35.144006 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:35.143852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" event={"ID":"0583cad7-fc5e-4718-a618-7be058c6fa40","Type":"ContainerDied","Data":"58b4345e2d074c4673d863b3ae0ca510af0cdd06caf607753277e97875904fb7"}
Apr 16 17:42:35.144006 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:42:35.143882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-76b6b8b57d-ht8l8" event={"ID":"0583cad7-fc5e-4718-a618-7be058c6fa40","Type":"ContainerStarted","Data":"84e4267fae45362146843fce59f46ce6534f603020ae88b73807df31581a49ed"}
Apr 16 17:44:50.349517 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:44:50.349490 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 17:44:50.350592 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:44:50.350557 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 17:44:50.354404 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:44:50.354387 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 17:44:50.355420 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:44:50.355402 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 17:44:50.357720 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:44:50.357700 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 17:49:50.376720 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:49:50.376693 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 17:49:50.377237 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:49:50.376772 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 17:49:50.382368 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:49:50.382344 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 17:49:50.382525 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:49:50.382353 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 17:50:55.280274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.280232 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"]
Apr 16 17:50:55.280703 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.280523 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d3e502c-98bf-4943-aa5a-2412d9b53369" containerName="registry"
Apr 16 17:50:55.280703 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.280534 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3e502c-98bf-4943-aa5a-2412d9b53369" containerName="registry"
Apr 16 17:50:55.280703 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.280600 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d3e502c-98bf-4943-aa5a-2412d9b53369" containerName="registry"
Apr 16 17:50:55.283332 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.283316 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"
Apr 16 17:50:55.285856 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.285834 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:50:55.285990 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.285880 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 17:50:55.286462 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.286448 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-k9zsj\""
Apr 16 17:50:55.297097 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.297068 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"]
Apr 16 17:50:55.379420 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.379388 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn62q\" (UniqueName: \"kubernetes.io/projected/2dbf90b0-cdb4-49bf-9657-25e58574cc52-kube-api-access-sn62q\") pod \"openshift-lws-operator-bfc7f696d-7zznt\" (UID: \"2dbf90b0-cdb4-49bf-9657-25e58574cc52\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"
Apr 16 17:50:55.379420 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.379427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2dbf90b0-cdb4-49bf-9657-25e58574cc52-tmp\") pod \"openshift-lws-operator-bfc7f696d-7zznt\" (UID: \"2dbf90b0-cdb4-49bf-9657-25e58574cc52\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"
Apr 16 17:50:55.480322 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.480286 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn62q\" (UniqueName: \"kubernetes.io/projected/2dbf90b0-cdb4-49bf-9657-25e58574cc52-kube-api-access-sn62q\") pod \"openshift-lws-operator-bfc7f696d-7zznt\" (UID: \"2dbf90b0-cdb4-49bf-9657-25e58574cc52\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"
Apr 16 17:50:55.480501 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.480332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2dbf90b0-cdb4-49bf-9657-25e58574cc52-tmp\") pod \"openshift-lws-operator-bfc7f696d-7zznt\" (UID: \"2dbf90b0-cdb4-49bf-9657-25e58574cc52\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"
Apr 16 17:50:55.480744 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.480723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2dbf90b0-cdb4-49bf-9657-25e58574cc52-tmp\") pod \"openshift-lws-operator-bfc7f696d-7zznt\" (UID: \"2dbf90b0-cdb4-49bf-9657-25e58574cc52\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"
Apr 16 17:50:55.492870 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.492848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn62q\" (UniqueName: \"kubernetes.io/projected/2dbf90b0-cdb4-49bf-9657-25e58574cc52-kube-api-access-sn62q\") pod \"openshift-lws-operator-bfc7f696d-7zznt\" (UID: \"2dbf90b0-cdb4-49bf-9657-25e58574cc52\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"
Apr 16 17:50:55.592056 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.591958 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"
Apr 16 17:50:55.723756 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.723729 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt"]
Apr 16 17:50:55.725499 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:50:55.725472 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dbf90b0_cdb4_49bf_9657_25e58574cc52.slice/crio-76f8bb946a8a0e7c870a5d1c6b3fd5e148013bf9f523353b9c813cf637819b0d WatchSource:0}: Error finding container 76f8bb946a8a0e7c870a5d1c6b3fd5e148013bf9f523353b9c813cf637819b0d: Status 404 returned error can't find the container with id 76f8bb946a8a0e7c870a5d1c6b3fd5e148013bf9f523353b9c813cf637819b0d
Apr 16 17:50:55.726932 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:55.726915 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:50:56.538601 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:56.538518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt" event={"ID":"2dbf90b0-cdb4-49bf-9657-25e58574cc52","Type":"ContainerStarted","Data":"76f8bb946a8a0e7c870a5d1c6b3fd5e148013bf9f523353b9c813cf637819b0d"}
Apr 16 17:50:58.546485 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:58.546447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt" event={"ID":"2dbf90b0-cdb4-49bf-9657-25e58574cc52","Type":"ContainerStarted","Data":"d1d3e4686a610590ec679bfcd096c3cfe30d80933fbfd9dbe7442939d0b95235"}
Apr 16 17:50:58.565296 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:50:58.565245 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-7zznt" podStartSLOduration=1.131515263 podStartE2EDuration="3.565228666s" podCreationTimestamp="2026-04-16 17:50:55 +0000 UTC" firstStartedPulling="2026-04-16 17:50:55.727034977 +0000 UTC m=+665.886405237" lastFinishedPulling="2026-04-16 17:50:58.160748377 +0000 UTC m=+668.320118640" observedRunningTime="2026-04-16 17:50:58.564263208 +0000 UTC m=+668.723633491" watchObservedRunningTime="2026-04-16 17:50:58.565228666 +0000 UTC m=+668.724598948"
Apr 16 17:51:21.801864 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.801825 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"]
Apr 16 17:51:21.804977 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.804958 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.809060 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.809035 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 17:51:21.809175 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.809040 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 17:51:21.809175 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.809045 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 17:51:21.809175 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.809047 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-zm8gf\""
Apr 16 17:51:21.823611 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.823559 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"]
Apr 16 17:51:21.872996 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.872951 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c72848ee-1c1c-4e36-8b44-84817597b2e5-metrics-cert\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.873181 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.873011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c72848ee-1c1c-4e36-8b44-84817597b2e5-manager-config\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.873181 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.873062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9rf\" (UniqueName: \"kubernetes.io/projected/c72848ee-1c1c-4e36-8b44-84817597b2e5-kube-api-access-gm9rf\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.873181 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.873102 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72848ee-1c1c-4e36-8b44-84817597b2e5-cert\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.973656 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.973621 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9rf\" (UniqueName: \"kubernetes.io/projected/c72848ee-1c1c-4e36-8b44-84817597b2e5-kube-api-access-gm9rf\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.973821 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.973673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72848ee-1c1c-4e36-8b44-84817597b2e5-cert\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.973821 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.973803 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c72848ee-1c1c-4e36-8b44-84817597b2e5-metrics-cert\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.973891 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.973853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c72848ee-1c1c-4e36-8b44-84817597b2e5-manager-config\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.974413 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.974392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c72848ee-1c1c-4e36-8b44-84817597b2e5-manager-config\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.976315 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.976289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c72848ee-1c1c-4e36-8b44-84817597b2e5-metrics-cert\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.976759 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.976739 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72848ee-1c1c-4e36-8b44-84817597b2e5-cert\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:21.985184 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:21.985158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9rf\" (UniqueName: \"kubernetes.io/projected/c72848ee-1c1c-4e36-8b44-84817597b2e5-kube-api-access-gm9rf\") pod \"lws-controller-manager-64d875bb5b-lpxfh\" (UID: \"c72848ee-1c1c-4e36-8b44-84817597b2e5\") " pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:22.114118 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:22.114031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:22.250842 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:22.250696 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"]
Apr 16 17:51:22.253934 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:51:22.253903 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72848ee_1c1c_4e36_8b44_84817597b2e5.slice/crio-cd0d2e7bf55ec08b8704fe1ddce35453b47ff850e37b950c26e5b892c418afaa WatchSource:0}: Error finding container cd0d2e7bf55ec08b8704fe1ddce35453b47ff850e37b950c26e5b892c418afaa: Status 404 returned error can't find the container with id cd0d2e7bf55ec08b8704fe1ddce35453b47ff850e37b950c26e5b892c418afaa
Apr 16 17:51:22.612927 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:22.612895 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh" event={"ID":"c72848ee-1c1c-4e36-8b44-84817597b2e5","Type":"ContainerStarted","Data":"cd0d2e7bf55ec08b8704fe1ddce35453b47ff850e37b950c26e5b892c418afaa"}
Apr 16 17:51:24.620710 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:24.620674 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh" event={"ID":"c72848ee-1c1c-4e36-8b44-84817597b2e5","Type":"ContainerStarted","Data":"a54e8df476cf7ad6a914790cbe1fab60d4aa0f7b9febde61e13542231036190b"}
Apr 16 17:51:24.621080 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:24.620810 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:51:24.639496 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:24.639436 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh" podStartSLOduration=2.195033792 podStartE2EDuration="3.639418754s" podCreationTimestamp="2026-04-16 17:51:21 +0000 UTC" firstStartedPulling="2026-04-16 17:51:22.255707847 +0000 UTC m=+692.415078110" lastFinishedPulling="2026-04-16 17:51:23.700092807 +0000 UTC m=+693.859463072" observedRunningTime="2026-04-16 17:51:24.638769108 +0000 UTC m=+694.798139390" watchObservedRunningTime="2026-04-16 17:51:24.639418754 +0000 UTC m=+694.798789037"
Apr 16 17:51:35.626184 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:51:35.626150 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-64d875bb5b-lpxfh"
Apr 16 17:52:28.026429 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.026387 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"]
Apr 16 17:52:28.029560 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.029543 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.031490 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.031453 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 17:52:28.031650 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.031611 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 17:52:28.031650 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.031615 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 17:52:28.031768 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.031703 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-v8xg6\""
Apr 16 17:52:28.032201 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.032184 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 17:52:28.038905 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.038884 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"]
Apr 16 17:52:28.194638 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.194567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2z5\" (UniqueName: \"kubernetes.io/projected/26296ac9-1125-4bfb-a32b-3be340b90915-kube-api-access-pr2z5\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.194807 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.194677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26296ac9-1125-4bfb-a32b-3be340b90915-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.194807 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.194702 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26296ac9-1125-4bfb-a32b-3be340b90915-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.295535 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.295431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26296ac9-1125-4bfb-a32b-3be340b90915-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.295535 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.295483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26296ac9-1125-4bfb-a32b-3be340b90915-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.295802 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.295550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr2z5\" (UniqueName: \"kubernetes.io/projected/26296ac9-1125-4bfb-a32b-3be340b90915-kube-api-access-pr2z5\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.296053 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.296031 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26296ac9-1125-4bfb-a32b-3be340b90915-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.298114 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.298089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26296ac9-1125-4bfb-a32b-3be340b90915-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.306816 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.306783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr2z5\" (UniqueName: \"kubernetes.io/projected/26296ac9-1125-4bfb-a32b-3be340b90915-kube-api-access-pr2z5\") pod \"kuadrant-console-plugin-6c886788f8-rhf4w\" (UID: \"26296ac9-1125-4bfb-a32b-3be340b90915\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.339849 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.339817 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"
Apr 16 17:52:28.462103 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.462074 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w"]
Apr 16 17:52:28.465119 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:52:28.465094 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26296ac9_1125_4bfb_a32b_3be340b90915.slice/crio-255aa5391bce292d14db5c80b843f9afbf3918085ab00f712bb3453815eeca2f WatchSource:0}: Error finding container 255aa5391bce292d14db5c80b843f9afbf3918085ab00f712bb3453815eeca2f: Status 404 returned error can't find the container with id 255aa5391bce292d14db5c80b843f9afbf3918085ab00f712bb3453815eeca2f
Apr 16 17:52:28.804265 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:28.804224 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w" event={"ID":"26296ac9-1125-4bfb-a32b-3be340b90915","Type":"ContainerStarted","Data":"255aa5391bce292d14db5c80b843f9afbf3918085ab00f712bb3453815eeca2f"}
Apr 16 17:52:33.822665 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:33.822627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w" event={"ID":"26296ac9-1125-4bfb-a32b-3be340b90915","Type":"ContainerStarted","Data":"f7832bc6bc67d1c1c3d4924a4f1d4a1dd286c5a77338098680e703a36009fd9b"}
Apr 16 17:52:33.840362 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:52:33.840315 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-rhf4w" podStartSLOduration=1.2538584720000001 podStartE2EDuration="5.840299813s" podCreationTimestamp="2026-04-16 17:52:28 +0000 UTC" firstStartedPulling="2026-04-16 17:52:28.466809878 +0000 UTC m=+758.626180137" lastFinishedPulling="2026-04-16 17:52:33.053251213 +0000 UTC m=+763.212621478" observedRunningTime="2026-04-16 17:52:33.839728922 +0000 UTC m=+763.999099205" watchObservedRunningTime="2026-04-16 17:52:33.840299813 +0000 UTC m=+763.999670094"
Apr 16 17:53:09.986103 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:09.986023 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-9b5kw"]
Apr 16 17:53:10.009318 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.009279 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-9b5kw"]
Apr 16 17:53:10.009472 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.009408 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:10.011743 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.011714 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 17:53:10.013606 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.013585 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-9b5kw"]
Apr 16 17:53:10.090787 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.090734 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6b7b3e39-33a1-454b-a469-1317dc782df3-config-file\") pod \"limitador-limitador-67566c68b4-9b5kw\" (UID: \"6b7b3e39-33a1-454b-a469-1317dc782df3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:10.090976 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.090830 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xspgs\" (UniqueName: \"kubernetes.io/projected/6b7b3e39-33a1-454b-a469-1317dc782df3-kube-api-access-xspgs\") pod \"limitador-limitador-67566c68b4-9b5kw\" (UID: \"6b7b3e39-33a1-454b-a469-1317dc782df3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:10.196643 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.196598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6b7b3e39-33a1-454b-a469-1317dc782df3-config-file\") pod \"limitador-limitador-67566c68b4-9b5kw\" (UID: \"6b7b3e39-33a1-454b-a469-1317dc782df3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:10.196812 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.196770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xspgs\" (UniqueName: \"kubernetes.io/projected/6b7b3e39-33a1-454b-a469-1317dc782df3-kube-api-access-xspgs\") pod \"limitador-limitador-67566c68b4-9b5kw\" (UID: \"6b7b3e39-33a1-454b-a469-1317dc782df3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:10.197354 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.197327 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6b7b3e39-33a1-454b-a469-1317dc782df3-config-file\") pod \"limitador-limitador-67566c68b4-9b5kw\" (UID: \"6b7b3e39-33a1-454b-a469-1317dc782df3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:10.206190 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.206160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xspgs\" (UniqueName: \"kubernetes.io/projected/6b7b3e39-33a1-454b-a469-1317dc782df3-kube-api-access-xspgs\") pod \"limitador-limitador-67566c68b4-9b5kw\" (UID: \"6b7b3e39-33a1-454b-a469-1317dc782df3\") " pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:10.319892 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.319800 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:10.454623 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.454591 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-9b5kw"]
Apr 16 17:53:10.457748 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:53:10.457712 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b7b3e39_33a1_454b_a469_1317dc782df3.slice/crio-27d0fc447383a2a5ba3430716e0f7fba3099dff2e627f8ad74748746ad36be9d WatchSource:0}: Error finding container 27d0fc447383a2a5ba3430716e0f7fba3099dff2e627f8ad74748746ad36be9d: Status 404 returned error can't find the container with id 27d0fc447383a2a5ba3430716e0f7fba3099dff2e627f8ad74748746ad36be9d
Apr 16 17:53:10.928667 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:10.928629 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw" event={"ID":"6b7b3e39-33a1-454b-a469-1317dc782df3","Type":"ContainerStarted","Data":"27d0fc447383a2a5ba3430716e0f7fba3099dff2e627f8ad74748746ad36be9d"}
Apr 16 17:53:11.933338 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:11.933294 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw" event={"ID":"6b7b3e39-33a1-454b-a469-1317dc782df3","Type":"ContainerStarted","Data":"c643b8cf60a5d5347136b0f9b0c784a8b11cca064f43c58af653a0f92737e088"}
Apr 16 17:53:11.933756 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:11.933421 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:53:11.951752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:11.951693 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw" podStartSLOduration=1.873843036 podStartE2EDuration="2.951670492s" podCreationTimestamp="2026-04-16 17:53:09 +0000 UTC" firstStartedPulling="2026-04-16 17:53:10.459992544 +0000 UTC m=+800.619362803" lastFinishedPulling="2026-04-16 17:53:11.537819994 +0000 UTC m=+801.697190259" observedRunningTime="2026-04-16 17:53:11.950922588 +0000 UTC m=+802.110292870" watchObservedRunningTime="2026-04-16 17:53:11.951670492 +0000 UTC m=+802.111040777"
Apr 16 17:53:22.936937 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:53:22.936904 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-9b5kw"
Apr 16 17:54:50.396532 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:54:50.396452 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 17:54:50.398224 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:54:50.398200 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 17:54:50.402329 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:54:50.402310 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 17:54:50.404018 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:54:50.404000 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 17:55:19.023183 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.023150 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-mtdcv"]
Apr 16 17:55:19.026354 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.026338 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mtdcv" Apr 16 17:55:19.028267 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.028243 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6w4xh\"" Apr 16 17:55:19.028405 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.028276 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 17:55:19.028405 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.028284 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 17:55:19.028405 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.028365 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 17:55:19.034549 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.034516 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-mtdcv"] Apr 16 17:55:19.100341 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.100308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n48pq\" (UniqueName: \"kubernetes.io/projected/663cb3c8-1b78-4009-9914-a9da0d3efede-kube-api-access-n48pq\") pod \"s3-init-mtdcv\" (UID: \"663cb3c8-1b78-4009-9914-a9da0d3efede\") " pod="kserve/s3-init-mtdcv" Apr 16 17:55:19.201505 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.201475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n48pq\" (UniqueName: \"kubernetes.io/projected/663cb3c8-1b78-4009-9914-a9da0d3efede-kube-api-access-n48pq\") pod \"s3-init-mtdcv\" (UID: \"663cb3c8-1b78-4009-9914-a9da0d3efede\") " pod="kserve/s3-init-mtdcv" Apr 16 17:55:19.211735 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.211705 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-n48pq\" (UniqueName: \"kubernetes.io/projected/663cb3c8-1b78-4009-9914-a9da0d3efede-kube-api-access-n48pq\") pod \"s3-init-mtdcv\" (UID: \"663cb3c8-1b78-4009-9914-a9da0d3efede\") " pod="kserve/s3-init-mtdcv" Apr 16 17:55:19.335280 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.335193 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mtdcv" Apr 16 17:55:19.457024 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:19.456943 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-mtdcv"] Apr 16 17:55:19.459717 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:55:19.459546 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod663cb3c8_1b78_4009_9914_a9da0d3efede.slice/crio-de19f428540660b1d954c4b364b093d7dd75859e5d2273983b843b1070930994 WatchSource:0}: Error finding container de19f428540660b1d954c4b364b093d7dd75859e5d2273983b843b1070930994: Status 404 returned error can't find the container with id de19f428540660b1d954c4b364b093d7dd75859e5d2273983b843b1070930994 Apr 16 17:55:20.305902 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:20.305853 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mtdcv" event={"ID":"663cb3c8-1b78-4009-9914-a9da0d3efede","Type":"ContainerStarted","Data":"de19f428540660b1d954c4b364b093d7dd75859e5d2273983b843b1070930994"} Apr 16 17:55:24.323461 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:24.323427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mtdcv" event={"ID":"663cb3c8-1b78-4009-9914-a9da0d3efede","Type":"ContainerStarted","Data":"d3ffce20bcf905e30bba2429e2e3ad9f67883c6fce3994a696ca0cef18830775"} Apr 16 17:55:24.341261 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:24.341207 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-mtdcv" podStartSLOduration=0.903206681 
podStartE2EDuration="5.341191239s" podCreationTimestamp="2026-04-16 17:55:19 +0000 UTC" firstStartedPulling="2026-04-16 17:55:19.46152297 +0000 UTC m=+929.620893230" lastFinishedPulling="2026-04-16 17:55:23.899507524 +0000 UTC m=+934.058877788" observedRunningTime="2026-04-16 17:55:24.339509069 +0000 UTC m=+934.498879351" watchObservedRunningTime="2026-04-16 17:55:24.341191239 +0000 UTC m=+934.500561522" Apr 16 17:55:27.334368 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:27.334283 2574 generic.go:358] "Generic (PLEG): container finished" podID="663cb3c8-1b78-4009-9914-a9da0d3efede" containerID="d3ffce20bcf905e30bba2429e2e3ad9f67883c6fce3994a696ca0cef18830775" exitCode=0 Apr 16 17:55:27.334730 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:27.334359 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mtdcv" event={"ID":"663cb3c8-1b78-4009-9914-a9da0d3efede","Type":"ContainerDied","Data":"d3ffce20bcf905e30bba2429e2e3ad9f67883c6fce3994a696ca0cef18830775"} Apr 16 17:55:28.465001 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:28.464979 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mtdcv" Apr 16 17:55:28.585369 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:28.585334 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n48pq\" (UniqueName: \"kubernetes.io/projected/663cb3c8-1b78-4009-9914-a9da0d3efede-kube-api-access-n48pq\") pod \"663cb3c8-1b78-4009-9914-a9da0d3efede\" (UID: \"663cb3c8-1b78-4009-9914-a9da0d3efede\") " Apr 16 17:55:28.587680 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:28.587653 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/663cb3c8-1b78-4009-9914-a9da0d3efede-kube-api-access-n48pq" (OuterVolumeSpecName: "kube-api-access-n48pq") pod "663cb3c8-1b78-4009-9914-a9da0d3efede" (UID: "663cb3c8-1b78-4009-9914-a9da0d3efede"). InnerVolumeSpecName "kube-api-access-n48pq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:55:28.686313 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:28.686279 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n48pq\" (UniqueName: \"kubernetes.io/projected/663cb3c8-1b78-4009-9914-a9da0d3efede-kube-api-access-n48pq\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:55:29.343388 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:29.343356 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mtdcv" Apr 16 17:55:29.343388 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:29.343371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mtdcv" event={"ID":"663cb3c8-1b78-4009-9914-a9da0d3efede","Type":"ContainerDied","Data":"de19f428540660b1d954c4b364b093d7dd75859e5d2273983b843b1070930994"} Apr 16 17:55:29.343619 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:29.343399 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de19f428540660b1d954c4b364b093d7dd75859e5d2273983b843b1070930994" Apr 16 17:55:53.344029 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.343998 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql"] Apr 16 17:55:53.344528 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.344309 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="663cb3c8-1b78-4009-9914-a9da0d3efede" containerName="s3-init" Apr 16 17:55:53.344528 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.344320 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="663cb3c8-1b78-4009-9914-a9da0d3efede" containerName="s3-init" Apr 16 17:55:53.344528 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.344369 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="663cb3c8-1b78-4009-9914-a9da0d3efede" containerName="s3-init" Apr 16 
17:55:53.376368 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.376335 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql"] Apr 16 17:55:53.376520 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.376436 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.379005 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.378981 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 17:55:53.379288 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.379273 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-cd9lt\"" Apr 16 17:55:53.379338 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.379283 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4mzq6\"" Apr 16 17:55:53.379338 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.379299 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 17:55:53.381062 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.381045 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 17:55:53.466702 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.466670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.466870 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.466724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.466870 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.466760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.466870 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.466793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.466870 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.466816 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: 
\"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.466870 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.466835 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd7vk\" (UniqueName: \"kubernetes.io/projected/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kube-api-access-xd7vk\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.567646 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.567609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.567646 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.567651 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.567866 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.567679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: 
\"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.567866 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.567700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.567866 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.567719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xd7vk\" (UniqueName: \"kubernetes.io/projected/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kube-api-access-xd7vk\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.567866 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.567757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.568078 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.568040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: 
\"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.568118 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.568063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.568118 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.568093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.568228 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.568211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.570330 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.570308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.578202 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.578181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd7vk\" (UniqueName: \"kubernetes.io/projected/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kube-api-access-xd7vk\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.684981 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.684945 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:55:53.816110 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:53.816077 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql"] Apr 16 17:55:53.819430 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:55:53.819402 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7e2ef10_3ac2_4abb_bfb8_5240f39552d8.slice/crio-9c6b9ad400429f95013433a69eea89abb117f559a7c86cea35bdddb31c2142df WatchSource:0}: Error finding container 9c6b9ad400429f95013433a69eea89abb117f559a7c86cea35bdddb31c2142df: Status 404 returned error can't find the container with id 9c6b9ad400429f95013433a69eea89abb117f559a7c86cea35bdddb31c2142df Apr 16 17:55:54.415519 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:54.415476 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" event={"ID":"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8","Type":"ContainerStarted","Data":"9c6b9ad400429f95013433a69eea89abb117f559a7c86cea35bdddb31c2142df"} Apr 16 
17:55:57.426728 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:57.426694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" event={"ID":"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8","Type":"ContainerStarted","Data":"fb74fe6ac86108c0558f12f46103cc944bea78d71e6a4327d673c1af2e9651c8"} Apr 16 17:55:58.435006 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:58.434962 2574 generic.go:358] "Generic (PLEG): container finished" podID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerID="fb74fe6ac86108c0558f12f46103cc944bea78d71e6a4327d673c1af2e9651c8" exitCode=0 Apr 16 17:55:58.435471 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:58.435027 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" event={"ID":"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8","Type":"ContainerDied","Data":"fb74fe6ac86108c0558f12f46103cc944bea78d71e6a4327d673c1af2e9651c8"} Apr 16 17:55:58.436063 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:55:58.436048 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:56:00.444235 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:00.444199 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" event={"ID":"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8","Type":"ContainerStarted","Data":"fff993026bc8ac3c674be598a0b310888dce68e469c18acd39f2ad41f15aa860"} Apr 16 17:56:30.541258 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:30.541221 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" event={"ID":"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8","Type":"ContainerStarted","Data":"1dc4733de5a979a3b36e81d283d75021df8eebe272787c0b14e52bc8c45871ba"} Apr 16 17:56:30.541682 ip-10-0-138-45 
kubenswrapper[2574]: I0416 17:56:30.541523 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:56:30.543951 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:30.543927 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:56:30.564971 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:30.564913 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" podStartSLOduration=1.413940598 podStartE2EDuration="37.564899166s" podCreationTimestamp="2026-04-16 17:55:53 +0000 UTC" firstStartedPulling="2026-04-16 17:55:53.821456699 +0000 UTC m=+963.980826959" lastFinishedPulling="2026-04-16 17:56:29.972415263 +0000 UTC m=+1000.131785527" observedRunningTime="2026-04-16 17:56:30.562182686 +0000 UTC m=+1000.721552968" watchObservedRunningTime="2026-04-16 17:56:30.564899166 +0000 UTC m=+1000.724269448" Apr 16 17:56:33.685062 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:33.685024 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:56:33.685062 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:33.685070 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:56:43.687253 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:43.687218 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:56:43.688493 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:43.688472 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" Apr 16 17:56:54.846787 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:54.846750 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"] Apr 16 17:56:54.948869 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:54.948838 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"] Apr 16 17:56:54.949036 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:54.948964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" Apr 16 17:56:54.951472 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:54.951450 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 17:56:55.000403 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.000365 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fg6\" (UniqueName: \"kubernetes.io/projected/f43070d1-0aca-470c-8964-f29ed7887e95-kube-api-access-d5fg6\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" Apr 16 17:56:55.000531 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.000407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-model-cache\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" Apr 16 
17:56:55.000531 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.000429 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-dshm\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.000531 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.000491 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-home\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.000531 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.000524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43070d1-0aca-470c-8964-f29ed7887e95-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.000699 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.000540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-home\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43070d1-0aca-470c-8964-f29ed7887e95-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101352 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101352 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5fg6\" (UniqueName: \"kubernetes.io/projected/f43070d1-0aca-470c-8964-f29ed7887e95-kube-api-access-d5fg6\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101466 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-model-cache\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101513 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101500 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-dshm\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101558 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-home\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101635 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.101836 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.101817 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-model-cache\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.103819 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.103792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-dshm\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.103819 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.103809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43070d1-0aca-470c-8964-f29ed7887e95-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.113064 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.113032 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fg6\" (UniqueName: \"kubernetes.io/projected/f43070d1-0aca-470c-8964-f29ed7887e95-kube-api-access-d5fg6\") pod \"scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.259547 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.259507 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:56:55.387704 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.387678 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"]
Apr 16 17:56:55.390367 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:56:55.390334 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf43070d1_0aca_470c_8964_f29ed7887e95.slice/crio-a5ec77452e064825dcbeafecd4c5f12c9f1a8942e0af9784c81a00bee90fe52b WatchSource:0}: Error finding container a5ec77452e064825dcbeafecd4c5f12c9f1a8942e0af9784c81a00bee90fe52b: Status 404 returned error can't find the container with id a5ec77452e064825dcbeafecd4c5f12c9f1a8942e0af9784c81a00bee90fe52b
Apr 16 17:56:55.629787 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.629750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" event={"ID":"f43070d1-0aca-470c-8964-f29ed7887e95","Type":"ContainerStarted","Data":"3aa7fa930cb20d0da868863ca69e5c54da6e593ba966451716236eed98f68f05"}
Apr 16 17:56:55.629787 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:55.629784 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" event={"ID":"f43070d1-0aca-470c-8964-f29ed7887e95","Type":"ContainerStarted","Data":"a5ec77452e064825dcbeafecd4c5f12c9f1a8942e0af9784c81a00bee90fe52b"}
Apr 16 17:56:56.984004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:56.983964 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql"]
Apr 16 17:56:56.984472 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:56.984429 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="main" containerID="cri-o://fff993026bc8ac3c674be598a0b310888dce68e469c18acd39f2ad41f15aa860" gracePeriod=30
Apr 16 17:56:56.984544 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:56.984476 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="tokenizer" containerID="cri-o://1dc4733de5a979a3b36e81d283d75021df8eebe272787c0b14e52bc8c45871ba" gracePeriod=30
Apr 16 17:56:57.648636 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:57.648598 2574 generic.go:358] "Generic (PLEG): container finished" podID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerID="fff993026bc8ac3c674be598a0b310888dce68e469c18acd39f2ad41f15aa860" exitCode=0
Apr 16 17:56:57.648636 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:57.648615 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" event={"ID":"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8","Type":"ContainerDied","Data":"fff993026bc8ac3c674be598a0b310888dce68e469c18acd39f2ad41f15aa860"}
Apr 16 17:56:58.654860 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:58.654782 2574 generic.go:358] "Generic (PLEG): container finished" podID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerID="1dc4733de5a979a3b36e81d283d75021df8eebe272787c0b14e52bc8c45871ba" exitCode=0
Apr 16 17:56:58.654860 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:58.654839 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" event={"ID":"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8","Type":"ContainerDied","Data":"1dc4733de5a979a3b36e81d283d75021df8eebe272787c0b14e52bc8c45871ba"}
Apr 16 17:56:58.913862 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:58.913735 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql"
Apr 16 17:56:59.039059 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039027 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-uds\") pod \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") "
Apr 16 17:56:59.039059 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039072 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd7vk\" (UniqueName: \"kubernetes.io/projected/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kube-api-access-xd7vk\") pod \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") "
Apr 16 17:56:59.039299 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039106 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-tmp\") pod \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") "
Apr 16 17:56:59.039299 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039154 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kserve-provision-location\") pod \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") "
Apr 16 17:56:59.039419 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039302 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-cache\") pod \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") "
Apr 16 17:56:59.039419 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039353 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tls-certs\") pod \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\" (UID: \"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8\") "
Apr 16 17:56:59.039419 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039385 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" (UID: "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:56:59.039560 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039480 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" (UID: "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:56:59.039658 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039610 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" (UID: "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:56:59.039715 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039702 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-tmp\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 17:56:59.039752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039722 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 17:56:59.039752 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.039736 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tokenizer-uds\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 17:56:59.040180 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.040128 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" (UID: "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:56:59.041608 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.041553 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kube-api-access-xd7vk" (OuterVolumeSpecName: "kube-api-access-xd7vk") pod "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" (UID: "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8"). InnerVolumeSpecName "kube-api-access-xd7vk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:56:59.041745 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.041724 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" (UID: "b7e2ef10-3ac2-4abb-bfb8-5240f39552d8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:56:59.140925 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.140893 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xd7vk\" (UniqueName: \"kubernetes.io/projected/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kube-api-access-xd7vk\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 17:56:59.140925 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.140922 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 17:56:59.140925 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.140932 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 17:56:59.660040 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.660012 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql"
Apr 16 17:56:59.660040 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.660020 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql" event={"ID":"b7e2ef10-3ac2-4abb-bfb8-5240f39552d8","Type":"ContainerDied","Data":"9c6b9ad400429f95013433a69eea89abb117f559a7c86cea35bdddb31c2142df"}
Apr 16 17:56:59.660739 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.660076 2574 scope.go:117] "RemoveContainer" containerID="1dc4733de5a979a3b36e81d283d75021df8eebe272787c0b14e52bc8c45871ba"
Apr 16 17:56:59.661566 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.661546 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" event={"ID":"f43070d1-0aca-470c-8964-f29ed7887e95","Type":"ContainerDied","Data":"3aa7fa930cb20d0da868863ca69e5c54da6e593ba966451716236eed98f68f05"}
Apr 16 17:56:59.661691 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.661530 2574 generic.go:358] "Generic (PLEG): container finished" podID="f43070d1-0aca-470c-8964-f29ed7887e95" containerID="3aa7fa930cb20d0da868863ca69e5c54da6e593ba966451716236eed98f68f05" exitCode=0
Apr 16 17:56:59.669448 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.669419 2574 scope.go:117] "RemoveContainer" containerID="fff993026bc8ac3c674be598a0b310888dce68e469c18acd39f2ad41f15aa860"
Apr 16 17:56:59.676605 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.676591 2574 scope.go:117] "RemoveContainer" containerID="fb74fe6ac86108c0558f12f46103cc944bea78d71e6a4327d673c1af2e9651c8"
Apr 16 17:56:59.708742 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.708711 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql"]
Apr 16 17:56:59.721997 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:56:59.721954 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5f99cjdjql"]
Apr 16 17:57:00.456012 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:00.455982 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" path="/var/lib/kubelet/pods/b7e2ef10-3ac2-4abb-bfb8-5240f39552d8/volumes"
Apr 16 17:57:01.671629 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:01.671594 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" event={"ID":"f43070d1-0aca-470c-8964-f29ed7887e95","Type":"ContainerStarted","Data":"0704aa4667e715420a563ae3dfd10351a04ef668b51b649d9f93d44da21cfbe3"}
Apr 16 17:57:01.694445 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:01.694385 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" podStartSLOduration=6.292621259 podStartE2EDuration="7.694360514s" podCreationTimestamp="2026-04-16 17:56:54 +0000 UTC" firstStartedPulling="2026-04-16 17:56:59.662653765 +0000 UTC m=+1029.822024029" lastFinishedPulling="2026-04-16 17:57:01.064393022 +0000 UTC m=+1031.223763284" observedRunningTime="2026-04-16 17:57:01.691184388 +0000 UTC m=+1031.850554670" watchObservedRunningTime="2026-04-16 17:57:01.694360514 +0000 UTC m=+1031.853730801"
Apr 16 17:57:05.259815 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:05.259775 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:57:05.259815 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:05.259826 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:57:05.272271 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:05.272238 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:57:05.695834 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:05.695811 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"
Apr 16 17:57:44.929563 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.929479 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"]
Apr 16 17:57:44.930126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.929925 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="main"
Apr 16 17:57:44.930126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.929943 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="main"
Apr 16 17:57:44.930126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.929966 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="storage-initializer"
Apr 16 17:57:44.930126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.929974 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="storage-initializer"
Apr 16 17:57:44.930126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.929992 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="tokenizer"
Apr 16 17:57:44.930126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.930002 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="tokenizer"
Apr 16 17:57:44.930126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.930088 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="main"
Apr 16 17:57:44.930126 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.930101 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7e2ef10-3ac2-4abb-bfb8-5240f39552d8" containerName="tokenizer"
Apr 16 17:57:44.932996 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.932977 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:44.946476 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.946457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 16 17:57:44.984552 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:44.984513 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"]
Apr 16 17:57:45.006118 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.006084 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmkb7\" (UniqueName: \"kubernetes.io/projected/5c25f100-adff-45c0-b7f2-1e0046d5081e-kube-api-access-nmkb7\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.006118 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.006126 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.006309 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.006171 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.006309 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.006194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.006309 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.006217 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c25f100-adff-45c0-b7f2-1e0046d5081e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.006309 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.006275 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107439 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107439 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107709 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c25f100-adff-45c0-b7f2-1e0046d5081e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107709 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107709 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmkb7\" (UniqueName: \"kubernetes.io/projected/5c25f100-adff-45c0-b7f2-1e0046d5081e-kube-api-access-nmkb7\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107709 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107897 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107834 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107978 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.107978 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.107925 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.110075 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.110040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.110305 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.110289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c25f100-adff-45c0-b7f2-1e0046d5081e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.120556 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.120529 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmkb7\" (UniqueName: \"kubernetes.io/projected/5c25f100-adff-45c0-b7f2-1e0046d5081e-kube-api-access-nmkb7\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.242590 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.242471 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 17:57:45.367051 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.367016 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"]
Apr 16 17:57:45.371685 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.371666 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"
Apr 16 17:57:45.375205 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.375184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-zm5gd\""
Apr 16 17:57:45.390859 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.390824 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"]
Apr 16 17:57:45.401805 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.401770 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"]
Apr 16 17:57:45.408012 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:57:45.407984 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c25f100_adff_45c0_b7f2_1e0046d5081e.slice/crio-a0ded536e1c309a9dd820ef60b9af020d2c0b92f9b90aa7556bd0d053f9422db WatchSource:0}: Error finding container a0ded536e1c309a9dd820ef60b9af020d2c0b92f9b90aa7556bd0d053f9422db: Status 404 returned error can't find the container with id a0ded536e1c309a9dd820ef60b9af020d2c0b92f9b90aa7556bd0d053f9422db
Apr 16 17:57:45.409840 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.409819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"
Apr 16 17:57:45.409934 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.409859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgm4q\" (UniqueName: \"kubernetes.io/projected/92093a23-ac7d-4b93-9a22-4985c792fe1d-kube-api-access-hgm4q\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"
Apr 16 17:57:45.409934 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.409893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92093a23-ac7d-4b93-9a22-4985c792fe1d-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"
Apr 16 17:57:45.410004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.409941 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"
Apr 16 17:57:45.410041 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.410004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"
Apr 16 17:57:45.410041 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.410035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") "
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.511425 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.511409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.511734 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.511592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgm4q\" (UniqueName: \"kubernetes.io/projected/92093a23-ac7d-4b93-9a22-4985c792fe1d-kube-api-access-hgm4q\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.511734 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.511658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92093a23-ac7d-4b93-9a22-4985c792fe1d-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.511832 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.511741 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.511832 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.511788 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.512123 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.512096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.512219 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.512137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.512256 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.512243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.514333 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.514313 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92093a23-ac7d-4b93-9a22-4985c792fe1d-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.527566 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.527532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgm4q\" (UniqueName: \"kubernetes.io/projected/92093a23-ac7d-4b93-9a22-4985c792fe1d-kube-api-access-hgm4q\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.680938 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.680896 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:45.815474 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.815386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" event={"ID":"5c25f100-adff-45c0-b7f2-1e0046d5081e","Type":"ContainerStarted","Data":"ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1"} Apr 16 17:57:45.815474 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.815424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" event={"ID":"5c25f100-adff-45c0-b7f2-1e0046d5081e","Type":"ContainerStarted","Data":"a0ded536e1c309a9dd820ef60b9af020d2c0b92f9b90aa7556bd0d053f9422db"} Apr 16 17:57:45.847655 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:45.847624 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"] Apr 16 17:57:45.850520 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:57:45.850491 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92093a23_ac7d_4b93_9a22_4985c792fe1d.slice/crio-a37e234777090dc47a4c40d9f4fdbe723fb978e9000e85be81c2a1554e517289 WatchSource:0}: Error finding container a37e234777090dc47a4c40d9f4fdbe723fb978e9000e85be81c2a1554e517289: Status 404 returned error can't find the container with id a37e234777090dc47a4c40d9f4fdbe723fb978e9000e85be81c2a1554e517289 Apr 16 17:57:46.821667 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:46.821624 2574 generic.go:358] "Generic (PLEG): container finished" podID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerID="7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085" exitCode=0 Apr 16 17:57:46.822089 ip-10-0-138-45 kubenswrapper[2574]: I0416 
17:57:46.821710 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" event={"ID":"92093a23-ac7d-4b93-9a22-4985c792fe1d","Type":"ContainerDied","Data":"7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085"} Apr 16 17:57:46.822089 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:46.821752 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" event={"ID":"92093a23-ac7d-4b93-9a22-4985c792fe1d","Type":"ContainerStarted","Data":"a37e234777090dc47a4c40d9f4fdbe723fb978e9000e85be81c2a1554e517289"} Apr 16 17:57:47.828670 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:47.828625 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" event={"ID":"92093a23-ac7d-4b93-9a22-4985c792fe1d","Type":"ContainerStarted","Data":"cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d"} Apr 16 17:57:47.829098 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:47.828679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" event={"ID":"92093a23-ac7d-4b93-9a22-4985c792fe1d","Type":"ContainerStarted","Data":"1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1"} Apr 16 17:57:47.829098 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:47.828777 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:47.883351 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:47.883291 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" podStartSLOduration=2.883266362 podStartE2EDuration="2.883266362s" 
podCreationTimestamp="2026-04-16 17:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:57:47.877945191 +0000 UTC m=+1078.037315473" watchObservedRunningTime="2026-04-16 17:57:47.883266362 +0000 UTC m=+1078.042636646" Apr 16 17:57:49.836517 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:49.836481 2574 generic.go:358] "Generic (PLEG): container finished" podID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerID="ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1" exitCode=0 Apr 16 17:57:49.836906 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:49.836529 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" event={"ID":"5c25f100-adff-45c0-b7f2-1e0046d5081e","Type":"ContainerDied","Data":"ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1"} Apr 16 17:57:55.682072 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:55.682033 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:55.684052 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:55.682753 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:57:55.685118 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:55.684805 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.28:8082/healthz\": dial tcp 10.132.0.28:8082: connect: connection refused" Apr 16 17:57:56.441515 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:56.441472 2574 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"] Apr 16 17:57:56.441907 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:57:56.441875 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" podUID="f43070d1-0aca-470c-8964-f29ed7887e95" containerName="main" containerID="cri-o://0704aa4667e715420a563ae3dfd10351a04ef668b51b649d9f93d44da21cfbe3" gracePeriod=30 Apr 16 17:58:01.890004 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:01.889958 2574 generic.go:358] "Generic (PLEG): container finished" podID="f43070d1-0aca-470c-8964-f29ed7887e95" containerID="0704aa4667e715420a563ae3dfd10351a04ef668b51b649d9f93d44da21cfbe3" exitCode=0 Apr 16 17:58:01.890518 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:01.890045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" event={"ID":"f43070d1-0aca-470c-8964-f29ed7887e95","Type":"ContainerDied","Data":"0704aa4667e715420a563ae3dfd10351a04ef668b51b649d9f93d44da21cfbe3"} Apr 16 17:58:02.484466 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.484439 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" Apr 16 17:58:02.574063 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.574025 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5fg6\" (UniqueName: \"kubernetes.io/projected/f43070d1-0aca-470c-8964-f29ed7887e95-kube-api-access-d5fg6\") pod \"f43070d1-0aca-470c-8964-f29ed7887e95\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " Apr 16 17:58:02.574246 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.574082 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-kserve-provision-location\") pod \"f43070d1-0aca-470c-8964-f29ed7887e95\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " Apr 16 17:58:02.574246 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.574135 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43070d1-0aca-470c-8964-f29ed7887e95-tls-certs\") pod \"f43070d1-0aca-470c-8964-f29ed7887e95\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " Apr 16 17:58:02.574246 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.574196 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-home\") pod \"f43070d1-0aca-470c-8964-f29ed7887e95\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " Apr 16 17:58:02.574246 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.574224 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-dshm\") pod \"f43070d1-0aca-470c-8964-f29ed7887e95\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " Apr 16 17:58:02.574458 ip-10-0-138-45 
kubenswrapper[2574]: I0416 17:58:02.574251 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-model-cache\") pod \"f43070d1-0aca-470c-8964-f29ed7887e95\" (UID: \"f43070d1-0aca-470c-8964-f29ed7887e95\") " Apr 16 17:58:02.574771 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.574737 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-model-cache" (OuterVolumeSpecName: "model-cache") pod "f43070d1-0aca-470c-8964-f29ed7887e95" (UID: "f43070d1-0aca-470c-8964-f29ed7887e95"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:02.574909 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.574789 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-home" (OuterVolumeSpecName: "home") pod "f43070d1-0aca-470c-8964-f29ed7887e95" (UID: "f43070d1-0aca-470c-8964-f29ed7887e95"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:02.576982 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.576950 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43070d1-0aca-470c-8964-f29ed7887e95-kube-api-access-d5fg6" (OuterVolumeSpecName: "kube-api-access-d5fg6") pod "f43070d1-0aca-470c-8964-f29ed7887e95" (UID: "f43070d1-0aca-470c-8964-f29ed7887e95"). InnerVolumeSpecName "kube-api-access-d5fg6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:58:02.577169 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.577134 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-dshm" (OuterVolumeSpecName: "dshm") pod "f43070d1-0aca-470c-8964-f29ed7887e95" (UID: "f43070d1-0aca-470c-8964-f29ed7887e95"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:02.577274 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.577184 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43070d1-0aca-470c-8964-f29ed7887e95-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f43070d1-0aca-470c-8964-f29ed7887e95" (UID: "f43070d1-0aca-470c-8964-f29ed7887e95"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:58:02.643951 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.643887 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f43070d1-0aca-470c-8964-f29ed7887e95" (UID: "f43070d1-0aca-470c-8964-f29ed7887e95"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:58:02.675753 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.675720 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5fg6\" (UniqueName: \"kubernetes.io/projected/f43070d1-0aca-470c-8964-f29ed7887e95-kube-api-access-d5fg6\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:58:02.675753 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.675756 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:58:02.675979 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.675771 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f43070d1-0aca-470c-8964-f29ed7887e95-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:58:02.675979 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.675786 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:58:02.675979 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.675799 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:58:02.675979 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.675811 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f43070d1-0aca-470c-8964-f29ed7887e95-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 17:58:02.895840 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.895755 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" event={"ID":"f43070d1-0aca-470c-8964-f29ed7887e95","Type":"ContainerDied","Data":"a5ec77452e064825dcbeafecd4c5f12c9f1a8942e0af9784c81a00bee90fe52b"} Apr 16 17:58:02.895840 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.895805 2574 scope.go:117] "RemoveContainer" containerID="0704aa4667e715420a563ae3dfd10351a04ef668b51b649d9f93d44da21cfbe3" Apr 16 17:58:02.895840 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.895826 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f" Apr 16 17:58:02.905487 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.905377 2574 scope.go:117] "RemoveContainer" containerID="3aa7fa930cb20d0da868863ca69e5c54da6e593ba966451716236eed98f68f05" Apr 16 17:58:02.939548 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.939508 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"] Apr 16 17:58:02.947257 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:02.947230 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-f674f99bb-m7t8f"] Apr 16 17:58:04.455971 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:04.455937 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43070d1-0aca-470c-8964-f29ed7887e95" path="/var/lib/kubelet/pods/f43070d1-0aca-470c-8964-f29ed7887e95/volumes" Apr 16 17:58:05.683775 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:05.683744 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:58:05.685031 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:05.685010 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:58:17.950440 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:17.950404 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" event={"ID":"5c25f100-adff-45c0-b7f2-1e0046d5081e","Type":"ContainerStarted","Data":"f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3"} Apr 16 17:58:17.985677 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:17.985622 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podStartSLOduration=6.802378594 podStartE2EDuration="33.985605173s" podCreationTimestamp="2026-04-16 17:57:44 +0000 UTC" firstStartedPulling="2026-04-16 17:57:49.837808982 +0000 UTC m=+1079.997179242" lastFinishedPulling="2026-04-16 17:58:17.021035562 +0000 UTC m=+1107.180405821" observedRunningTime="2026-04-16 17:58:17.984662326 +0000 UTC m=+1108.144032608" watchObservedRunningTime="2026-04-16 17:58:17.985605173 +0000 UTC m=+1108.144975455" Apr 16 17:58:25.242891 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:25.242843 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" Apr 16 17:58:25.243285 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:25.242983 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" Apr 16 17:58:25.244393 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:25.244366 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:58:25.909759 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:25.909728 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 17:58:35.243389 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:35.243343 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:58:45.244029 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:45.243968 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:58:55.243106 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:58:55.243063 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:59:02.016670 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.016626 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9"] Apr 16 17:59:02.017037 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.016973 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f43070d1-0aca-470c-8964-f29ed7887e95" 
containerName="storage-initializer" Apr 16 17:59:02.017037 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.016984 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43070d1-0aca-470c-8964-f29ed7887e95" containerName="storage-initializer" Apr 16 17:59:02.017037 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.016998 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f43070d1-0aca-470c-8964-f29ed7887e95" containerName="main" Apr 16 17:59:02.017037 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.017003 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43070d1-0aca-470c-8964-f29ed7887e95" containerName="main" Apr 16 17:59:02.017167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.017074 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f43070d1-0aca-470c-8964-f29ed7887e95" containerName="main" Apr 16 17:59:02.025868 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.025842 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.030293 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.030268 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 17:59:02.047675 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.047644 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9"] Apr 16 17:59:02.068383 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.068350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7tw\" (UniqueName: \"kubernetes.io/projected/0fccc588-e184-4c53-98b9-e204d906eb32-kube-api-access-xf7tw\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.068528 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.068465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-home\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.068528 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.068504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-kserve-provision-location\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.068639 ip-10-0-138-45 kubenswrapper[2574]: I0416 
17:59:02.068529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-dshm\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.068639 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.068545 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fccc588-e184-4c53-98b9-e204d906eb32-tls-certs\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.068736 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.068630 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-model-cache\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.170167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.170110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-kserve-provision-location\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.170167 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.170172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-dshm\") 
pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.170463 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.170207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fccc588-e184-4c53-98b9-e204d906eb32-tls-certs\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.170463 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.170242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-model-cache\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.170463 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.170283 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7tw\" (UniqueName: \"kubernetes.io/projected/0fccc588-e184-4c53-98b9-e204d906eb32-kube-api-access-xf7tw\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.170463 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.170378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-home\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.170739 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.170707 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-kserve-provision-location\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.170871 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.170813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-home\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.171044 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.171015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-model-cache\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.173772 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.173735 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-dshm\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.174448 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.174423 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fccc588-e184-4c53-98b9-e204d906eb32-tls-certs\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.189154 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.189119 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7tw\" (UniqueName: \"kubernetes.io/projected/0fccc588-e184-4c53-98b9-e204d906eb32-kube-api-access-xf7tw\") pod \"stop-feature-test-kserve-844cb9bffc-rqlk9\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.336122 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.336038 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:02.489760 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:02.489726 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9"] Apr 16 17:59:02.493771 ip-10-0-138-45 kubenswrapper[2574]: W0416 17:59:02.493742 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fccc588_e184_4c53_98b9_e204d906eb32.slice/crio-efad6bac649f4a1f8674fd44c92b0ebd14874b99700fee01e91a3cb352657f74 WatchSource:0}: Error finding container efad6bac649f4a1f8674fd44c92b0ebd14874b99700fee01e91a3cb352657f74: Status 404 returned error can't find the container with id efad6bac649f4a1f8674fd44c92b0ebd14874b99700fee01e91a3cb352657f74 Apr 16 17:59:03.108985 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:03.108948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" event={"ID":"0fccc588-e184-4c53-98b9-e204d906eb32","Type":"ContainerStarted","Data":"1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085"} Apr 16 17:59:03.108985 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:03.108985 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" event={"ID":"0fccc588-e184-4c53-98b9-e204d906eb32","Type":"ContainerStarted","Data":"efad6bac649f4a1f8674fd44c92b0ebd14874b99700fee01e91a3cb352657f74"} Apr 16 17:59:05.242972 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:05.242931 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:59:07.125033 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:07.124937 2574 generic.go:358] "Generic (PLEG): container finished" podID="0fccc588-e184-4c53-98b9-e204d906eb32" containerID="1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085" exitCode=0 Apr 16 17:59:07.125033 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:07.125011 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" event={"ID":"0fccc588-e184-4c53-98b9-e204d906eb32","Type":"ContainerDied","Data":"1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085"} Apr 16 17:59:08.131106 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:08.131006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" event={"ID":"0fccc588-e184-4c53-98b9-e204d906eb32","Type":"ContainerStarted","Data":"f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687"} Apr 16 17:59:08.204727 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:08.204670 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podStartSLOduration=7.204651621 podStartE2EDuration="7.204651621s" podCreationTimestamp="2026-04-16 17:59:01 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:59:08.204467119 +0000 UTC m=+1158.363837404" watchObservedRunningTime="2026-04-16 17:59:08.204651621 +0000 UTC m=+1158.364021902" Apr 16 17:59:12.337147 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:12.337089 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:12.337147 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:12.337147 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 17:59:12.338890 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:12.338853 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 17:59:15.243417 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:15.243370 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:59:22.337002 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:22.336954 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 17:59:25.242953 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:25.242908 2574 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:59:32.336983 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:32.336940 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 17:59:35.243892 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:35.243837 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:59:42.337517 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:42.337468 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 17:59:45.243794 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:45.243750 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.27:8000/health\": dial tcp 10.132.0.27:8000: connect: connection refused" Apr 16 17:59:50.422038 
ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:50.422003 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log" Apr 16 17:59:50.422486 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:50.422422 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log" Apr 16 17:59:50.434318 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:50.434292 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log" Apr 16 17:59:50.434511 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:50.434491 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log" Apr 16 17:59:52.337044 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:52.337005 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 17:59:55.252888 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:55.252858 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" Apr 16 17:59:55.260627 ip-10-0-138-45 kubenswrapper[2574]: I0416 17:59:55.260601 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" Apr 16 18:00:02.336923 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:02.336863 2574 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 16 18:00:02.618052 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:02.616650 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"] Apr 16 18:00:02.618052 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:02.617060 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="main" containerID="cri-o://1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1" gracePeriod=30 Apr 16 18:00:02.618052 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:02.617441 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="tokenizer" containerID="cri-o://cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d" gracePeriod=30 Apr 16 18:00:02.618941 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:02.618477 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"] Apr 16 18:00:02.618941 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:02.618853 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main" containerID="cri-o://f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3" gracePeriod=30 Apr 16 
18:00:03.327530 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:03.327493 2574 generic.go:358] "Generic (PLEG): container finished" podID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerID="1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1" exitCode=0 Apr 16 18:00:03.327714 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:03.327592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" event={"ID":"92093a23-ac7d-4b93-9a22-4985c792fe1d","Type":"ContainerDied","Data":"1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1"} Apr 16 18:00:03.978324 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:03.978300 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 18:00:04.136273 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136179 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-cache\") pod \"92093a23-ac7d-4b93-9a22-4985c792fe1d\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " Apr 16 18:00:04.136273 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136253 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgm4q\" (UniqueName: \"kubernetes.io/projected/92093a23-ac7d-4b93-9a22-4985c792fe1d-kube-api-access-hgm4q\") pod \"92093a23-ac7d-4b93-9a22-4985c792fe1d\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " Apr 16 18:00:04.136507 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136292 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-kserve-provision-location\") pod \"92093a23-ac7d-4b93-9a22-4985c792fe1d\" 
(UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " Apr 16 18:00:04.136507 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136312 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-uds\") pod \"92093a23-ac7d-4b93-9a22-4985c792fe1d\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " Apr 16 18:00:04.136507 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136371 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-tmp\") pod \"92093a23-ac7d-4b93-9a22-4985c792fe1d\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " Apr 16 18:00:04.136507 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136407 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92093a23-ac7d-4b93-9a22-4985c792fe1d-tls-certs\") pod \"92093a23-ac7d-4b93-9a22-4985c792fe1d\" (UID: \"92093a23-ac7d-4b93-9a22-4985c792fe1d\") " Apr 16 18:00:04.136735 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136598 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "92093a23-ac7d-4b93-9a22-4985c792fe1d" (UID: "92093a23-ac7d-4b93-9a22-4985c792fe1d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:00:04.136735 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136648 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "92093a23-ac7d-4b93-9a22-4985c792fe1d" (UID: "92093a23-ac7d-4b93-9a22-4985c792fe1d"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:00:04.136837 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.136801 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "92093a23-ac7d-4b93-9a22-4985c792fe1d" (UID: "92093a23-ac7d-4b93-9a22-4985c792fe1d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:00:04.137099 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.137076 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "92093a23-ac7d-4b93-9a22-4985c792fe1d" (UID: "92093a23-ac7d-4b93-9a22-4985c792fe1d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:00:04.138638 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.138615 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92093a23-ac7d-4b93-9a22-4985c792fe1d-kube-api-access-hgm4q" (OuterVolumeSpecName: "kube-api-access-hgm4q") pod "92093a23-ac7d-4b93-9a22-4985c792fe1d" (UID: "92093a23-ac7d-4b93-9a22-4985c792fe1d"). InnerVolumeSpecName "kube-api-access-hgm4q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:00:04.138829 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.138809 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92093a23-ac7d-4b93-9a22-4985c792fe1d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "92093a23-ac7d-4b93-9a22-4985c792fe1d" (UID: "92093a23-ac7d-4b93-9a22-4985c792fe1d"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:00:04.237840 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.237799 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:00:04.237840 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.237836 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-uds\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:00:04.237840 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.237849 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-tmp\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:00:04.238079 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.237860 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92093a23-ac7d-4b93-9a22-4985c792fe1d-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:00:04.238079 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.237871 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92093a23-ac7d-4b93-9a22-4985c792fe1d-tokenizer-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:00:04.238079 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.237883 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hgm4q\" (UniqueName: \"kubernetes.io/projected/92093a23-ac7d-4b93-9a22-4985c792fe1d-kube-api-access-hgm4q\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:00:04.333157 ip-10-0-138-45 kubenswrapper[2574]: I0416 
18:00:04.333128 2574 generic.go:358] "Generic (PLEG): container finished" podID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerID="cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d" exitCode=0 Apr 16 18:00:04.333351 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.333218 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" event={"ID":"92093a23-ac7d-4b93-9a22-4985c792fe1d","Type":"ContainerDied","Data":"cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d"} Apr 16 18:00:04.333351 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.333241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" event={"ID":"92093a23-ac7d-4b93-9a22-4985c792fe1d","Type":"ContainerDied","Data":"a37e234777090dc47a4c40d9f4fdbe723fb978e9000e85be81c2a1554e517289"} Apr 16 18:00:04.333351 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.333267 2574 scope.go:117] "RemoveContainer" containerID="cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d" Apr 16 18:00:04.333351 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.333280 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh" Apr 16 18:00:04.342305 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.342284 2574 scope.go:117] "RemoveContainer" containerID="1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1" Apr 16 18:00:04.350530 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.350512 2574 scope.go:117] "RemoveContainer" containerID="7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085" Apr 16 18:00:04.358144 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.358123 2574 scope.go:117] "RemoveContainer" containerID="cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d" Apr 16 18:00:04.358434 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:00:04.358411 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d\": container with ID starting with cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d not found: ID does not exist" containerID="cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d" Apr 16 18:00:04.358506 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.358448 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d"} err="failed to get container status \"cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d\": rpc error: code = NotFound desc = could not find container \"cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d\": container with ID starting with cfb1bd91f9069c93e7ba053923d15218739b55abb7d3b48a385fb9d01a55500d not found: ID does not exist" Apr 16 18:00:04.358506 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.358474 2574 scope.go:117] "RemoveContainer" containerID="1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1" Apr 
16 18:00:04.358766 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:00:04.358747 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1\": container with ID starting with 1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1 not found: ID does not exist" containerID="1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1"
Apr 16 18:00:04.358887 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.358773 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1"} err="failed to get container status \"1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1\": rpc error: code = NotFound desc = could not find container \"1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1\": container with ID starting with 1abedbf4c1e1dc2c6f3c45d983960d3d59a2e3ff07034b9b45f5832c9085fed1 not found: ID does not exist"
Apr 16 18:00:04.358887 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.358789 2574 scope.go:117] "RemoveContainer" containerID="7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085"
Apr 16 18:00:04.359046 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:00:04.359030 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085\": container with ID starting with 7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085 not found: ID does not exist" containerID="7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085"
Apr 16 18:00:04.359085 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.359050 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085"} err="failed to get container status \"7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085\": rpc error: code = NotFound desc = could not find container \"7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085\": container with ID starting with 7e5c1ea818e6c1097c6cd75e0ba5bc60b48d8f9ef1ebcba91140d14db25e0085 not found: ID does not exist"
Apr 16 18:00:04.368280 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.368259 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"]
Apr 16 18:00:04.376257 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.376236 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schej66sh"]
Apr 16 18:00:04.456019 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:04.455985 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" path="/var/lib/kubelet/pods/92093a23-ac7d-4b93-9a22-4985c792fe1d/volumes"
Apr 16 18:00:12.336870 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.336823 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 18:00:12.756248 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756207 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"]
Apr 16 18:00:12.756731 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756704 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="storage-initializer"
Apr 16 18:00:12.756731 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756730 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="storage-initializer"
Apr 16 18:00:12.756934 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756752 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="main"
Apr 16 18:00:12.756934 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756762 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="main"
Apr 16 18:00:12.756934 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756783 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="tokenizer"
Apr 16 18:00:12.756934 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756793 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="tokenizer"
Apr 16 18:00:12.756934 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756875 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="tokenizer"
Apr 16 18:00:12.756934 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.756888 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="92093a23-ac7d-4b93-9a22-4985c792fe1d" containerName="main"
Apr 16 18:00:12.762232 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.762204 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.764919 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.764897 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 16 18:00:12.784983 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.784939 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"]
Apr 16 18:00:12.810136 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.810098 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-model-cache\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.810136 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.810141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-home\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.810353 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.810168 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.810353 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.810202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-dshm\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.810353 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.810314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4f70356-c88a-42a5-9f75-c75b44ed08b9-tls-certs\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.810464 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.810363 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wj57\" (UniqueName: \"kubernetes.io/projected/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kube-api-access-7wj57\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.910880 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.910840 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wj57\" (UniqueName: \"kubernetes.io/projected/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kube-api-access-7wj57\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.910880 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.910889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-model-cache\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.911130 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.910918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-home\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.911130 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.910943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.911130 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.910977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-dshm\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.911130 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.911043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4f70356-c88a-42a5-9f75-c75b44ed08b9-tls-certs\") pod
\"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.911359 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.911329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-model-cache\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.911412 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.911381 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-home\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.911515 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.911492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.913346 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.913323 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-dshm\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.913603 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.913586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4f70356-c88a-42a5-9f75-c75b44ed08b9-tls-certs\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:12.925160 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:12.925136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wj57\" (UniqueName: \"kubernetes.io/projected/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kube-api-access-7wj57\") pod \"custom-route-timeout-test-kserve-6b8bb6c469-762fg\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:13.075369 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:13.075283 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:13.229438 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:13.229409 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"]
Apr 16 18:00:13.231011 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:00:13.230978 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4f70356_c88a_42a5_9f75_c75b44ed08b9.slice/crio-06e70f646b9f41a573679ed789fe3e310fc1bc876d9dc7e338e4b191b279c815 WatchSource:0}: Error finding container 06e70f646b9f41a573679ed789fe3e310fc1bc876d9dc7e338e4b191b279c815: Status 404 returned error can't find the container with id 06e70f646b9f41a573679ed789fe3e310fc1bc876d9dc7e338e4b191b279c815
Apr 16 18:00:13.367317 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:13.367220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" event={"ID":"b4f70356-c88a-42a5-9f75-c75b44ed08b9","Type":"ContainerStarted","Data":"6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95"}
Apr 16 18:00:13.367317 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:13.367263 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" event={"ID":"b4f70356-c88a-42a5-9f75-c75b44ed08b9","Type":"ContainerStarted","Data":"06e70f646b9f41a573679ed789fe3e310fc1bc876d9dc7e338e4b191b279c815"}
Apr 16 18:00:18.387382 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:18.387349 2574 generic.go:358] "Generic (PLEG): container finished" podID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerID="6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95" exitCode=0
Apr 16 18:00:18.387915 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:18.387412 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" event={"ID":"b4f70356-c88a-42a5-9f75-c75b44ed08b9","Type":"ContainerDied","Data":"6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95"}
Apr 16 18:00:19.399985 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:19.399950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" event={"ID":"b4f70356-c88a-42a5-9f75-c75b44ed08b9","Type":"ContainerStarted","Data":"a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5"}
Apr 16 18:00:19.426197 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:19.426132 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podStartSLOduration=7.426111656 podStartE2EDuration="7.426111656s" podCreationTimestamp="2026-04-16 18:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC"
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:00:19.423158135 +0000 UTC m=+1229.582528418" watchObservedRunningTime="2026-04-16 18:00:19.426111656 +0000 UTC m=+1229.585481939"
Apr 16 18:00:22.337212 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:22.337163 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 18:00:23.076353 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:23.076291 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:23.076561 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:23.076462 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"
Apr 16 18:00:23.077885 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:23.077859 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 18:00:32.336884 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.336834 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 18:00:32.940231 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.940201 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t_5c25f100-adff-45c0-b7f2-1e0046d5081e/main/0.log"
Apr 16 18:00:32.940664 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.940644 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 18:00:32.991069 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991036 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-model-cache\") pod \"5c25f100-adff-45c0-b7f2-1e0046d5081e\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") "
Apr 16 18:00:32.991069 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991073 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-home\") pod \"5c25f100-adff-45c0-b7f2-1e0046d5081e\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") "
Apr 16 18:00:32.991320 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991099 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c25f100-adff-45c0-b7f2-1e0046d5081e-tls-certs\") pod \"5c25f100-adff-45c0-b7f2-1e0046d5081e\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") "
Apr 16 18:00:32.991320 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991147 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-kserve-provision-location\") pod \"5c25f100-adff-45c0-b7f2-1e0046d5081e\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") "
Apr 16 18:00:32.991320 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991183 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmkb7\" (UniqueName: \"kubernetes.io/projected/5c25f100-adff-45c0-b7f2-1e0046d5081e-kube-api-access-nmkb7\") pod \"5c25f100-adff-45c0-b7f2-1e0046d5081e\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") "
Apr 16 18:00:32.991320 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991230 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-dshm\") pod \"5c25f100-adff-45c0-b7f2-1e0046d5081e\" (UID: \"5c25f100-adff-45c0-b7f2-1e0046d5081e\") "
Apr 16 18:00:32.991525 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991341 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-model-cache" (OuterVolumeSpecName: "model-cache") pod "5c25f100-adff-45c0-b7f2-1e0046d5081e" (UID: "5c25f100-adff-45c0-b7f2-1e0046d5081e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:00:32.991525 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991423 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-home" (OuterVolumeSpecName: "home") pod "5c25f100-adff-45c0-b7f2-1e0046d5081e" (UID: "5c25f100-adff-45c0-b7f2-1e0046d5081e"). InnerVolumeSpecName "home".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:00:32.991525 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.991452 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:00:32.994024 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.993990 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c25f100-adff-45c0-b7f2-1e0046d5081e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5c25f100-adff-45c0-b7f2-1e0046d5081e" (UID: "5c25f100-adff-45c0-b7f2-1e0046d5081e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:00:32.994312 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.994283 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c25f100-adff-45c0-b7f2-1e0046d5081e-kube-api-access-nmkb7" (OuterVolumeSpecName: "kube-api-access-nmkb7") pod "5c25f100-adff-45c0-b7f2-1e0046d5081e" (UID: "5c25f100-adff-45c0-b7f2-1e0046d5081e"). InnerVolumeSpecName "kube-api-access-nmkb7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:00:32.994416 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:32.994336 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-dshm" (OuterVolumeSpecName: "dshm") pod "5c25f100-adff-45c0-b7f2-1e0046d5081e" (UID: "5c25f100-adff-45c0-b7f2-1e0046d5081e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:00:33.057081 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.057012 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5c25f100-adff-45c0-b7f2-1e0046d5081e" (UID: "5c25f100-adff-45c0-b7f2-1e0046d5081e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:00:33.076326 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.076279 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 18:00:33.092457 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.092427 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:00:33.092457 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.092457 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c25f100-adff-45c0-b7f2-1e0046d5081e-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:00:33.092642 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.092473 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:00:33.092642 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.092483 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmkb7\" (UniqueName: \"kubernetes.io/projected/5c25f100-adff-45c0-b7f2-1e0046d5081e-kube-api-access-nmkb7\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:00:33.092642 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.092492 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c25f100-adff-45c0-b7f2-1e0046d5081e-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:00:33.449539 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.449510 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t_5c25f100-adff-45c0-b7f2-1e0046d5081e/main/0.log"
Apr 16 18:00:33.450027 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.449983 2574 generic.go:358] "Generic (PLEG): container finished" podID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerID="f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3" exitCode=137
Apr 16 18:00:33.450093 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.450043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" event={"ID":"5c25f100-adff-45c0-b7f2-1e0046d5081e","Type":"ContainerDied","Data":"f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3"}
Apr 16 18:00:33.450093 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.450079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t" event={"ID":"5c25f100-adff-45c0-b7f2-1e0046d5081e","Type":"ContainerDied","Data":"a0ded536e1c309a9dd820ef60b9af020d2c0b92f9b90aa7556bd0d053f9422db"}
Apr 16 18:00:33.450214 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.450101 2574 scope.go:117] "RemoveContainer" containerID="f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3"
Apr 16
18:00:33.450214 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.450109 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"
Apr 16 18:00:33.470895 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.470872 2574 scope.go:117] "RemoveContainer" containerID="ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1"
Apr 16 18:00:33.481925 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.481902 2574 scope.go:117] "RemoveContainer" containerID="f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3"
Apr 16 18:00:33.482422 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:00:33.482393 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3\": container with ID starting with f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3 not found: ID does not exist" containerID="f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3"
Apr 16 18:00:33.482526 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.482436 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3"} err="failed to get container status \"f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3\": rpc error: code = NotFound desc = could not find container \"f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3\": container with ID starting with f99a1b530c2207183972e1ecb2997354756d3ec1a1933a46dbcf3ce2b3f9a5c3 not found: ID does not exist"
Apr 16 18:00:33.482526 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.482464 2574 scope.go:117] "RemoveContainer" containerID="ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1"
Apr 16 18:00:33.483043 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:00:33.483003 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1\": container with ID starting with ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1 not found: ID does not exist" containerID="ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1"
Apr 16 18:00:33.483171 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.483047 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1"} err="failed to get container status \"ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1\": rpc error: code = NotFound desc = could not find container \"ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1\": container with ID starting with ecb6eb6b797d46c47570708993e2d1587db4f39a0cbd31655abbf500aef3d5a1 not found: ID does not exist"
Apr 16 18:00:33.483171 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.483022 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"]
Apr 16 18:00:33.490009 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:33.489977 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-857c96cf596mn5t"]
Apr 16 18:00:34.455685 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:34.455644 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" path="/var/lib/kubelet/pods/5c25f100-adff-45c0-b7f2-1e0046d5081e/volumes"
Apr 16 18:00:42.336587 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:42.336458 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused"
Apr 16 18:00:43.076643 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:43.076592 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 18:00:52.346996 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:52.346956 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9"
Apr 16 18:00:52.354808 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:52.354779 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9"
Apr 16 18:00:53.076447 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:53.076408 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 18:00:53.547201 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:53.547163 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9"]
Apr 16 18:00:53.547630 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:00:53.547399 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" containerID="cri-o://f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687" gracePeriod=30
Apr 16 18:01:03.076242 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:03.076185 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 18:01:13.076242 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:13.076195 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused"
Apr 16 18:01:21.639529 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.639491 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"]
Apr 16 18:01:21.639935 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.639870 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="storage-initializer"
Apr 16 18:01:21.639935 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.639885 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="storage-initializer"
Apr 16 18:01:21.639935 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.639897 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main"
Apr 16 18:01:21.639935 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.639904 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main"
Apr 16 18:01:21.640079 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.639972 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c25f100-adff-45c0-b7f2-1e0046d5081e" containerName="main"
Apr 16 18:01:21.643025 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.643009 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"
Apr 16 18:01:21.653457 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.653431 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"]
Apr 16 18:01:21.706165 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.706129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-dshm\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"
Apr 16 18:01:21.706165 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.706169 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-home\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"
Apr 16 18:01:21.706458 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.706203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-model-cache\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"
Apr 16 18:01:21.706458 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.706253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jczg\" (UniqueName: \"kubernetes.io/projected/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kube-api-access-6jczg\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"
Apr 16 18:01:21.706458 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.706279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kserve-provision-location\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"
Apr 16 18:01:21.706458 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.706376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8f2030d-fb34-4d00-bd41-c06db5d1543d-tls-certs\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"
Apr 16 18:01:21.806776 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.806739 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8f2030d-fb34-4d00-bd41-c06db5d1543d-tls-certs\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"
Apr 16 18:01:21.807026 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.806807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-dshm\") pod
\"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.807026 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.806835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-home\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.807026 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.806871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-model-cache\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.807026 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.806900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jczg\" (UniqueName: \"kubernetes.io/projected/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kube-api-access-6jczg\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.807026 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.806931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kserve-provision-location\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.807387 ip-10-0-138-45 kubenswrapper[2574]: 
I0416 18:01:21.807333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-home\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.807387 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.807367 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kserve-provision-location\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.807535 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.807432 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-model-cache\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.809450 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.809416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-dshm\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.809828 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.809807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8f2030d-fb34-4d00-bd41-c06db5d1543d-tls-certs\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: 
\"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.817162 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.817137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jczg\" (UniqueName: \"kubernetes.io/projected/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kube-api-access-6jczg\") pod \"stop-feature-test-kserve-844cb9bffc-9b6c8\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:21.952854 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:21.952818 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:22.102660 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:22.102631 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"] Apr 16 18:01:22.104917 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:01:22.104881 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8f2030d_fb34_4d00_bd41_c06db5d1543d.slice/crio-333c4c4bf70da9b916d92d7c49ef9dc4c104b04f335835a4ddf5825b6350d38a WatchSource:0}: Error finding container 333c4c4bf70da9b916d92d7c49ef9dc4c104b04f335835a4ddf5825b6350d38a: Status 404 returned error can't find the container with id 333c4c4bf70da9b916d92d7c49ef9dc4c104b04f335835a4ddf5825b6350d38a Apr 16 18:01:22.106728 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:22.106707 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:01:22.625229 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:22.625187 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" 
event={"ID":"a8f2030d-fb34-4d00-bd41-c06db5d1543d","Type":"ContainerStarted","Data":"e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb"} Apr 16 18:01:22.625229 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:22.625235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" event={"ID":"a8f2030d-fb34-4d00-bd41-c06db5d1543d","Type":"ContainerStarted","Data":"333c4c4bf70da9b916d92d7c49ef9dc4c104b04f335835a4ddf5825b6350d38a"} Apr 16 18:01:23.076349 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:23.076309 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 18:01:23.944460 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:23.944436 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-844cb9bffc-rqlk9_0fccc588-e184-4c53-98b9-e204d906eb32/main/0.log" Apr 16 18:01:23.944888 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:23.944870 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 18:01:24.028366 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.028331 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-kserve-provision-location\") pod \"0fccc588-e184-4c53-98b9-e204d906eb32\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " Apr 16 18:01:24.028543 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.028404 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-home\") pod \"0fccc588-e184-4c53-98b9-e204d906eb32\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " Apr 16 18:01:24.028543 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.028445 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-dshm\") pod \"0fccc588-e184-4c53-98b9-e204d906eb32\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " Apr 16 18:01:24.028543 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.028487 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fccc588-e184-4c53-98b9-e204d906eb32-tls-certs\") pod \"0fccc588-e184-4c53-98b9-e204d906eb32\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " Apr 16 18:01:24.028543 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.028529 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf7tw\" (UniqueName: \"kubernetes.io/projected/0fccc588-e184-4c53-98b9-e204d906eb32-kube-api-access-xf7tw\") pod \"0fccc588-e184-4c53-98b9-e204d906eb32\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " Apr 16 18:01:24.028779 ip-10-0-138-45 kubenswrapper[2574]: I0416 
18:01:24.028560 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-model-cache\") pod \"0fccc588-e184-4c53-98b9-e204d906eb32\" (UID: \"0fccc588-e184-4c53-98b9-e204d906eb32\") " Apr 16 18:01:24.028889 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.028837 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-home" (OuterVolumeSpecName: "home") pod "0fccc588-e184-4c53-98b9-e204d906eb32" (UID: "0fccc588-e184-4c53-98b9-e204d906eb32"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:01:24.029019 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.028987 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-model-cache" (OuterVolumeSpecName: "model-cache") pod "0fccc588-e184-4c53-98b9-e204d906eb32" (UID: "0fccc588-e184-4c53-98b9-e204d906eb32"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:01:24.030984 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.030925 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fccc588-e184-4c53-98b9-e204d906eb32-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0fccc588-e184-4c53-98b9-e204d906eb32" (UID: "0fccc588-e184-4c53-98b9-e204d906eb32"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:01:24.030984 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.030923 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-dshm" (OuterVolumeSpecName: "dshm") pod "0fccc588-e184-4c53-98b9-e204d906eb32" (UID: "0fccc588-e184-4c53-98b9-e204d906eb32"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:01:24.031640 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.031608 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fccc588-e184-4c53-98b9-e204d906eb32-kube-api-access-xf7tw" (OuterVolumeSpecName: "kube-api-access-xf7tw") pod "0fccc588-e184-4c53-98b9-e204d906eb32" (UID: "0fccc588-e184-4c53-98b9-e204d906eb32"). InnerVolumeSpecName "kube-api-access-xf7tw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:01:24.104559 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.104465 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0fccc588-e184-4c53-98b9-e204d906eb32" (UID: "0fccc588-e184-4c53-98b9-e204d906eb32"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:01:24.129794 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.129752 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0fccc588-e184-4c53-98b9-e204d906eb32-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:01:24.129794 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.129790 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xf7tw\" (UniqueName: \"kubernetes.io/projected/0fccc588-e184-4c53-98b9-e204d906eb32-kube-api-access-xf7tw\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:01:24.129983 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.129809 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:01:24.129983 ip-10-0-138-45 
kubenswrapper[2574]: I0416 18:01:24.129825 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:01:24.129983 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.129840 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:01:24.129983 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.129854 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0fccc588-e184-4c53-98b9-e204d906eb32-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:01:24.635708 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.635677 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-844cb9bffc-rqlk9_0fccc588-e184-4c53-98b9-e204d906eb32/main/0.log" Apr 16 18:01:24.636107 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.636078 2574 generic.go:358] "Generic (PLEG): container finished" podID="0fccc588-e184-4c53-98b9-e204d906eb32" containerID="f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687" exitCode=137 Apr 16 18:01:24.636183 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.636169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" event={"ID":"0fccc588-e184-4c53-98b9-e204d906eb32","Type":"ContainerDied","Data":"f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687"} Apr 16 18:01:24.636219 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.636202 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" 
event={"ID":"0fccc588-e184-4c53-98b9-e204d906eb32","Type":"ContainerDied","Data":"efad6bac649f4a1f8674fd44c92b0ebd14874b99700fee01e91a3cb352657f74"} Apr 16 18:01:24.636251 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.636225 2574 scope.go:117] "RemoveContainer" containerID="f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687" Apr 16 18:01:24.636283 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.636257 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9" Apr 16 18:01:24.657944 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.657878 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9"] Apr 16 18:01:24.664170 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.664044 2574 scope.go:117] "RemoveContainer" containerID="1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085" Apr 16 18:01:24.666052 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.666017 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-rqlk9"] Apr 16 18:01:24.748891 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.748866 2574 scope.go:117] "RemoveContainer" containerID="f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687" Apr 16 18:01:24.749458 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:01:24.749422 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687\": container with ID starting with f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687 not found: ID does not exist" containerID="f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687" Apr 16 18:01:24.749458 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.749468 2574 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687"} err="failed to get container status \"f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687\": rpc error: code = NotFound desc = could not find container \"f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687\": container with ID starting with f270e25a802a7cd69f4a4992d97e5c4871696c23fba9817709c702235c078687 not found: ID does not exist" Apr 16 18:01:24.749767 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.749495 2574 scope.go:117] "RemoveContainer" containerID="1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085" Apr 16 18:01:24.749866 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:01:24.749839 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085\": container with ID starting with 1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085 not found: ID does not exist" containerID="1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085" Apr 16 18:01:24.749922 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:24.749877 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085"} err="failed to get container status \"1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085\": rpc error: code = NotFound desc = could not find container \"1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085\": container with ID starting with 1b4d17d32cf198f8249631ef619dfa1bb4ebb079293ded64eecbe8dd550b9085 not found: ID does not exist" Apr 16 18:01:26.455032 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:26.454995 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" 
path="/var/lib/kubelet/pods/0fccc588-e184-4c53-98b9-e204d906eb32/volumes" Apr 16 18:01:26.647123 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:26.647029 2574 generic.go:358] "Generic (PLEG): container finished" podID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerID="e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb" exitCode=0 Apr 16 18:01:26.647123 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:26.647102 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" event={"ID":"a8f2030d-fb34-4d00-bd41-c06db5d1543d","Type":"ContainerDied","Data":"e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb"} Apr 16 18:01:27.653303 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:27.653264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" event={"ID":"a8f2030d-fb34-4d00-bd41-c06db5d1543d","Type":"ContainerStarted","Data":"49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586"} Apr 16 18:01:27.679682 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:27.679611 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podStartSLOduration=6.679552771 podStartE2EDuration="6.679552771s" podCreationTimestamp="2026-04-16 18:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:01:27.677461817 +0000 UTC m=+1297.836832114" watchObservedRunningTime="2026-04-16 18:01:27.679552771 +0000 UTC m=+1297.838923058" Apr 16 18:01:31.953083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:31.953043 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:31.953083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:31.953093 2574 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:01:31.954841 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:31.954808 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:01:33.076813 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:33.076760 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 18:01:41.953484 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:41.953442 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:01:43.076003 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:43.075946 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 18:01:51.954234 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:51.954195 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:01:53.076046 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:01:53.076001 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 16 18:02:01.953489 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:01.953439 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:02:03.086246 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:03.086215 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" Apr 16 18:02:03.094133 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:03.094092 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" Apr 16 18:02:10.098885 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:10.098805 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"] Apr 16 18:02:10.099400 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:10.099124 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" containerID="cri-o://a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5" 
gracePeriod=30 Apr 16 18:02:11.953937 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:11.953887 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:02:21.953988 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:21.953941 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:02:29.940731 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.940693 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck"] Apr 16 18:02:29.941209 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.941160 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" Apr 16 18:02:29.941209 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.941177 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" Apr 16 18:02:29.941209 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.941203 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="storage-initializer" Apr 16 18:02:29.941209 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.941211 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="storage-initializer" Apr 16 18:02:29.941421 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.941302 2574 
memory_manager.go:356] "RemoveStaleState removing state" podUID="0fccc588-e184-4c53-98b9-e204d906eb32" containerName="main" Apr 16 18:02:29.944788 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.944768 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:29.947105 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.947073 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 18:02:29.957264 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:29.957240 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck"] Apr 16 18:02:30.120449 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.120397 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0eb385-59d8-4ad1-a6ed-60d189133c04-tls-certs\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.120659 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.120484 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-model-cache\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.120659 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.120516 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-home\") pod 
\"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.120659 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.120618 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kserve-provision-location\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.120659 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.120654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chc7p\" (UniqueName: \"kubernetes.io/projected/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kube-api-access-chc7p\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.120826 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.120749 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-dshm\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.221587 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.221471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kserve-provision-location\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.221587 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.221554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chc7p\" (UniqueName: \"kubernetes.io/projected/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kube-api-access-chc7p\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.221809 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.221683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-dshm\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.221809 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.221734 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0eb385-59d8-4ad1-a6ed-60d189133c04-tls-certs\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.221809 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.221777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-model-cache\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.222001 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.221810 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-home\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.222001 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.221835 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kserve-provision-location\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.222103 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.222089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-model-cache\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.222213 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.222191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-home\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.224169 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.224145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-dshm\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.224620 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.224596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0eb385-59d8-4ad1-a6ed-60d189133c04-tls-certs\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.230814 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.230786 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chc7p\" (UniqueName: \"kubernetes.io/projected/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kube-api-access-chc7p\") pod \"router-with-refs-test-kserve-577c8d7bbf-sjzck\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.256497 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.256454 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:30.390088 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.390016 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck"] Apr 16 18:02:30.392727 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:02:30.392697 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f0eb385_59d8_4ad1_a6ed_60d189133c04.slice/crio-1bfd96620ba1296bd284b1ea0ab784a677d2cf5bab997cafc6db3e377f46f26e WatchSource:0}: Error finding container 1bfd96620ba1296bd284b1ea0ab784a677d2cf5bab997cafc6db3e377f46f26e: Status 404 returned error can't find the container with id 1bfd96620ba1296bd284b1ea0ab784a677d2cf5bab997cafc6db3e377f46f26e Apr 16 18:02:30.906461 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.906412 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" event={"ID":"4f0eb385-59d8-4ad1-a6ed-60d189133c04","Type":"ContainerStarted","Data":"29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e"} Apr 16 18:02:30.906461 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:30.906458 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" event={"ID":"4f0eb385-59d8-4ad1-a6ed-60d189133c04","Type":"ContainerStarted","Data":"1bfd96620ba1296bd284b1ea0ab784a677d2cf5bab997cafc6db3e377f46f26e"} Apr 16 18:02:31.953480 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:31.953434 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:02:34.923182 
ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:34.923149 2574 generic.go:358] "Generic (PLEG): container finished" podID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerID="29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e" exitCode=0 Apr 16 18:02:34.923744 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:34.923217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" event={"ID":"4f0eb385-59d8-4ad1-a6ed-60d189133c04","Type":"ContainerDied","Data":"29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e"} Apr 16 18:02:35.929564 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:35.929531 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" event={"ID":"4f0eb385-59d8-4ad1-a6ed-60d189133c04","Type":"ContainerStarted","Data":"fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe"} Apr 16 18:02:35.959702 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:35.959630 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podStartSLOduration=6.959607159 podStartE2EDuration="6.959607159s" podCreationTimestamp="2026-04-16 18:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:35.955370021 +0000 UTC m=+1366.114740302" watchObservedRunningTime="2026-04-16 18:02:35.959607159 +0000 UTC m=+1366.118977445" Apr 16 18:02:40.256842 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.256805 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:40.257293 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.256855 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:02:40.258369 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.258332 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 18:02:40.399060 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.398945 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6b8bb6c469-762fg_b4f70356-c88a-42a5-9f75-c75b44ed08b9/main/0.log" Apr 16 18:02:40.399845 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.399817 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" Apr 16 18:02:40.511658 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.511607 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kserve-provision-location\") pod \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " Apr 16 18:02:40.511658 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.511648 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-model-cache\") pod \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " Apr 16 18:02:40.511917 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.511684 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-dshm\") pod 
\"b4f70356-c88a-42a5-9f75-c75b44ed08b9\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " Apr 16 18:02:40.511917 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.511709 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wj57\" (UniqueName: \"kubernetes.io/projected/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kube-api-access-7wj57\") pod \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " Apr 16 18:02:40.511917 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.511761 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4f70356-c88a-42a5-9f75-c75b44ed08b9-tls-certs\") pod \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " Apr 16 18:02:40.511917 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.511808 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-home\") pod \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\" (UID: \"b4f70356-c88a-42a5-9f75-c75b44ed08b9\") " Apr 16 18:02:40.512138 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.511986 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-model-cache" (OuterVolumeSpecName: "model-cache") pod "b4f70356-c88a-42a5-9f75-c75b44ed08b9" (UID: "b4f70356-c88a-42a5-9f75-c75b44ed08b9"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:02:40.512138 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.512111 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:02:40.512322 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.512294 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-home" (OuterVolumeSpecName: "home") pod "b4f70356-c88a-42a5-9f75-c75b44ed08b9" (UID: "b4f70356-c88a-42a5-9f75-c75b44ed08b9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:02:40.514622 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.514555 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-dshm" (OuterVolumeSpecName: "dshm") pod "b4f70356-c88a-42a5-9f75-c75b44ed08b9" (UID: "b4f70356-c88a-42a5-9f75-c75b44ed08b9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:02:40.514749 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.514632 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f70356-c88a-42a5-9f75-c75b44ed08b9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b4f70356-c88a-42a5-9f75-c75b44ed08b9" (UID: "b4f70356-c88a-42a5-9f75-c75b44ed08b9"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:02:40.514749 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.514692 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kube-api-access-7wj57" (OuterVolumeSpecName: "kube-api-access-7wj57") pod "b4f70356-c88a-42a5-9f75-c75b44ed08b9" (UID: "b4f70356-c88a-42a5-9f75-c75b44ed08b9"). InnerVolumeSpecName "kube-api-access-7wj57". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:02:40.574921 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.574868 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b4f70356-c88a-42a5-9f75-c75b44ed08b9" (UID: "b4f70356-c88a-42a5-9f75-c75b44ed08b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:02:40.612777 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.612736 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:02:40.612777 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.612765 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:02:40.612777 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.612777 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7wj57\" (UniqueName: \"kubernetes.io/projected/b4f70356-c88a-42a5-9f75-c75b44ed08b9-kube-api-access-7wj57\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:02:40.612777 
ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.612786 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4f70356-c88a-42a5-9f75-c75b44ed08b9-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:02:40.613100 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.612795 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b4f70356-c88a-42a5-9f75-c75b44ed08b9-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:02:40.947774 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.947739 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6b8bb6c469-762fg_b4f70356-c88a-42a5-9f75-c75b44ed08b9/main/0.log" Apr 16 18:02:40.948120 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.948094 2574 generic.go:358] "Generic (PLEG): container finished" podID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerID="a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5" exitCode=137 Apr 16 18:02:40.948195 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.948139 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" event={"ID":"b4f70356-c88a-42a5-9f75-c75b44ed08b9","Type":"ContainerDied","Data":"a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5"} Apr 16 18:02:40.948195 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.948162 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" event={"ID":"b4f70356-c88a-42a5-9f75-c75b44ed08b9","Type":"ContainerDied","Data":"06e70f646b9f41a573679ed789fe3e310fc1bc876d9dc7e338e4b191b279c815"} Apr 16 18:02:40.948195 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.948176 2574 scope.go:117] "RemoveContainer" 
containerID="a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5" Apr 16 18:02:40.948354 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.948220 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg" Apr 16 18:02:40.969054 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.969033 2574 scope.go:117] "RemoveContainer" containerID="6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95" Apr 16 18:02:40.979215 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.979184 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"] Apr 16 18:02:40.980508 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.980481 2574 scope.go:117] "RemoveContainer" containerID="a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5" Apr 16 18:02:40.980932 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:02:40.980910 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5\": container with ID starting with a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5 not found: ID does not exist" containerID="a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5" Apr 16 18:02:40.981027 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.980942 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5"} err="failed to get container status \"a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5\": rpc error: code = NotFound desc = could not find container \"a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5\": container with ID starting with a8200aede958632965dbdc97dde98031ee41b19adbb2583d0ad4d50f6e9187f5 not found: 
ID does not exist" Apr 16 18:02:40.981027 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.980961 2574 scope.go:117] "RemoveContainer" containerID="6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95" Apr 16 18:02:40.981252 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:02:40.981235 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95\": container with ID starting with 6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95 not found: ID does not exist" containerID="6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95" Apr 16 18:02:40.981318 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.981260 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95"} err="failed to get container status \"6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95\": rpc error: code = NotFound desc = could not find container \"6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95\": container with ID starting with 6d499040663efc7eb5a786cea9047ce2b5c99354d835468ab14dd00fd7042e95 not found: ID does not exist" Apr 16 18:02:40.995823 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:40.995776 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6b8bb6c469-762fg"] Apr 16 18:02:41.953369 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:41.953322 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:02:42.455202 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:42.455163 
2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" path="/var/lib/kubelet/pods/b4f70356-c88a-42a5-9f75-c75b44ed08b9/volumes" Apr 16 18:02:50.257130 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:50.257076 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 18:02:51.954202 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:02:51.954149 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:03:00.257740 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:00.257694 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 18:03:01.954044 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:01.953999 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 16 18:03:10.257807 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:10.257762 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" 
podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 18:03:11.963536 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:11.963500 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:03:11.971781 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:11.971750 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:03:13.132137 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:13.132100 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"] Apr 16 18:03:13.132661 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:13.132419 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" containerID="cri-o://49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586" gracePeriod=30 Apr 16 18:03:20.257446 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:20.257402 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 18:03:30.257416 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:30.257360 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 
10.132.0.32:8000: connect: connection refused" Apr 16 18:03:40.257144 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:40.257024 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 18:03:43.419753 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.419706 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-844cb9bffc-9b6c8_a8f2030d-fb34-4d00-bd41-c06db5d1543d/main/0.log" Apr 16 18:03:43.420149 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.420134 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:03:43.459528 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.459494 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jczg\" (UniqueName: \"kubernetes.io/projected/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kube-api-access-6jczg\") pod \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " Apr 16 18:03:43.459734 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.459550 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-home\") pod \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " Apr 16 18:03:43.459734 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.459602 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8f2030d-fb34-4d00-bd41-c06db5d1543d-tls-certs\") pod \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\" (UID: 
\"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " Apr 16 18:03:43.459734 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.459651 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-model-cache\") pod \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " Apr 16 18:03:43.459908 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.459743 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kserve-provision-location\") pod \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " Apr 16 18:03:43.459908 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.459772 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-dshm\") pod \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\" (UID: \"a8f2030d-fb34-4d00-bd41-c06db5d1543d\") " Apr 16 18:03:43.460111 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.460017 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-home" (OuterVolumeSpecName: "home") pod "a8f2030d-fb34-4d00-bd41-c06db5d1543d" (UID: "a8f2030d-fb34-4d00-bd41-c06db5d1543d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:43.460509 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.460473 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-model-cache" (OuterVolumeSpecName: "model-cache") pod "a8f2030d-fb34-4d00-bd41-c06db5d1543d" (UID: "a8f2030d-fb34-4d00-bd41-c06db5d1543d"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:43.462406 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.462375 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8f2030d-fb34-4d00-bd41-c06db5d1543d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a8f2030d-fb34-4d00-bd41-c06db5d1543d" (UID: "a8f2030d-fb34-4d00-bd41-c06db5d1543d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:03:43.462566 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.462544 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-dshm" (OuterVolumeSpecName: "dshm") pod "a8f2030d-fb34-4d00-bd41-c06db5d1543d" (UID: "a8f2030d-fb34-4d00-bd41-c06db5d1543d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:43.462653 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.462557 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kube-api-access-6jczg" (OuterVolumeSpecName: "kube-api-access-6jczg") pod "a8f2030d-fb34-4d00-bd41-c06db5d1543d" (UID: "a8f2030d-fb34-4d00-bd41-c06db5d1543d"). InnerVolumeSpecName "kube-api-access-6jczg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:03:43.522930 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.522881 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8f2030d-fb34-4d00-bd41-c06db5d1543d" (UID: "a8f2030d-fb34-4d00-bd41-c06db5d1543d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:03:43.560698 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.560662 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:03:43.560698 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.560695 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:03:43.560858 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.560708 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jczg\" (UniqueName: \"kubernetes.io/projected/a8f2030d-fb34-4d00-bd41-c06db5d1543d-kube-api-access-6jczg\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:03:43.560858 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.560718 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:03:43.560858 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.560726 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a8f2030d-fb34-4d00-bd41-c06db5d1543d-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:03:43.560858 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:43.560735 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a8f2030d-fb34-4d00-bd41-c06db5d1543d-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:03:44.164172 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.164145 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-844cb9bffc-9b6c8_a8f2030d-fb34-4d00-bd41-c06db5d1543d/main/0.log" Apr 16 18:03:44.164511 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.164486 2574 generic.go:358] "Generic (PLEG): container finished" podID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerID="49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586" exitCode=137 Apr 16 18:03:44.164611 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.164563 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" event={"ID":"a8f2030d-fb34-4d00-bd41-c06db5d1543d","Type":"ContainerDied","Data":"49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586"} Apr 16 18:03:44.164611 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.164597 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" Apr 16 18:03:44.164727 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.164620 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8" event={"ID":"a8f2030d-fb34-4d00-bd41-c06db5d1543d","Type":"ContainerDied","Data":"333c4c4bf70da9b916d92d7c49ef9dc4c104b04f335835a4ddf5825b6350d38a"} Apr 16 18:03:44.164727 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.164637 2574 scope.go:117] "RemoveContainer" containerID="49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586" Apr 16 18:03:44.186161 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.186133 2574 scope.go:117] "RemoveContainer" containerID="e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb" Apr 16 18:03:44.195616 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.195563 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"] Apr 16 18:03:44.197428 ip-10-0-138-45 
kubenswrapper[2574]: I0416 18:03:44.197383 2574 scope.go:117] "RemoveContainer" containerID="49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586" Apr 16 18:03:44.197865 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:03:44.197839 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586\": container with ID starting with 49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586 not found: ID does not exist" containerID="49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586" Apr 16 18:03:44.197983 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.197878 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586"} err="failed to get container status \"49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586\": rpc error: code = NotFound desc = could not find container \"49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586\": container with ID starting with 49fc6aa76424cebf12ea639706ee7e8db3f0ab542223d99e658cea642aadd586 not found: ID does not exist" Apr 16 18:03:44.197983 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.197905 2574 scope.go:117] "RemoveContainer" containerID="e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb" Apr 16 18:03:44.198243 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:03:44.198206 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb\": container with ID starting with e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb not found: ID does not exist" containerID="e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb" Apr 16 18:03:44.198310 ip-10-0-138-45 kubenswrapper[2574]: I0416 
18:03:44.198254 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb"} err="failed to get container status \"e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb\": rpc error: code = NotFound desc = could not find container \"e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb\": container with ID starting with e853d55cade868b8d007b8476bd4a22f0ba788d398f5487f7aa3eac32c4b1fcb not found: ID does not exist" Apr 16 18:03:44.199179 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.199160 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-844cb9bffc-9b6c8"] Apr 16 18:03:44.455125 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:44.455084 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" path="/var/lib/kubelet/pods/a8f2030d-fb34-4d00-bd41-c06db5d1543d/volumes" Apr 16 18:03:50.257931 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:03:50.257892 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 18:04:00.256916 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:00.256862 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 16 18:04:10.267003 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:10.266971 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:04:10.274539 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:10.274517 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:04:18.287751 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:18.287713 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck"] Apr 16 18:04:18.288151 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:18.288102 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main" containerID="cri-o://fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe" gracePeriod=30 Apr 16 18:04:24.843408 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843362 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7"] Apr 16 18:04:24.843913 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843800 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="storage-initializer" Apr 16 18:04:24.843913 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843815 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="storage-initializer" Apr 16 18:04:24.843913 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843860 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" Apr 16 18:04:24.843913 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843869 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" 
containerName="main" Apr 16 18:04:24.843913 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843884 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" Apr 16 18:04:24.843913 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843892 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" Apr 16 18:04:24.843913 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843902 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="storage-initializer" Apr 16 18:04:24.843913 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843910 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="storage-initializer" Apr 16 18:04:24.844282 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.843992 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8f2030d-fb34-4d00-bd41-c06db5d1543d" containerName="main" Apr 16 18:04:24.844282 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.844004 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4f70356-c88a-42a5-9f75-c75b44ed08b9" containerName="main" Apr 16 18:04:24.846827 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.846806 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.859457 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.854306 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 18:04:24.859457 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.854665 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-ckt4h\"" Apr 16 18:04:24.859457 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.856296 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq"] Apr 16 18:04:24.863518 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.863482 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.864214 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.864186 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7"] Apr 16 18:04:24.874070 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.874047 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq"] Apr 16 18:04:24.895226 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.895386 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.895386 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895296 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60138e-b980-49ed-a234-2ad31edecde5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.895386 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.895386 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895380 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.895617 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895397 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.895617 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895417 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.895617 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895477 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.895617 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmtxn\" (UniqueName: \"kubernetes.io/projected/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kube-api-access-gmtxn\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: 
\"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.895617 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vq9l\" (UniqueName: \"kubernetes.io/projected/1f60138e-b980-49ed-a234-2ad31edecde5-kube-api-access-6vq9l\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.895831 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895633 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.895831 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.895666 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997022 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.996986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: 
\"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.997022 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997284 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997058 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997284 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.997284 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmtxn\" (UniqueName: \"kubernetes.io/projected/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kube-api-access-gmtxn\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.997284 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vq9l\" (UniqueName: \"kubernetes.io/projected/1f60138e-b980-49ed-a234-2ad31edecde5-kube-api-access-6vq9l\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997284 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.997284 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997617 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997441 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.997617 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997617 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.997770 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997818 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997875 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.997875 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60138e-b980-49ed-a234-2ad31edecde5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.997875 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.997870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.998156 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.998136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:24.998259 ip-10-0-138-45 
kubenswrapper[2574]: I0416 18:04:24.998235 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.999732 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.999706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:24.999920 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:24.999900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:25.000039 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:25.000023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:25.000184 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:25.000169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60138e-b980-49ed-a234-2ad31edecde5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:25.016311 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:25.016278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmtxn\" (UniqueName: \"kubernetes.io/projected/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kube-api-access-gmtxn\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:25.016827 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:25.016804 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vq9l\" (UniqueName: \"kubernetes.io/projected/1f60138e-b980-49ed-a234-2ad31edecde5-kube-api-access-6vq9l\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:25.166307 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:25.166215 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:25.176233 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:25.176100 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:25.331029 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:25.330999 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7"] Apr 16 18:04:25.332774 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:04:25.332748 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f60138e_b980_49ed_a234_2ad31edecde5.slice/crio-2727482bba0e56cc1dee7db3d52b16b5fdc53392b01435737136f974bb5b40e0 WatchSource:0}: Error finding container 2727482bba0e56cc1dee7db3d52b16b5fdc53392b01435737136f974bb5b40e0: Status 404 returned error can't find the container with id 2727482bba0e56cc1dee7db3d52b16b5fdc53392b01435737136f974bb5b40e0 Apr 16 18:04:25.342473 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:25.342442 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq"] Apr 16 18:04:25.345861 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:04:25.345836 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b8df8f_1d64_437d_a608_8fc7a5c1f8ff.slice/crio-c829227308cffaac99c1ba91be858d74971d1e9cc13df198f918e19323ea84a1 WatchSource:0}: Error finding container c829227308cffaac99c1ba91be858d74971d1e9cc13df198f918e19323ea84a1: Status 404 returned error can't find the container with id c829227308cffaac99c1ba91be858d74971d1e9cc13df198f918e19323ea84a1 Apr 16 18:04:26.294852 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:26.294819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" 
event={"ID":"1f60138e-b980-49ed-a234-2ad31edecde5","Type":"ContainerStarted","Data":"2727482bba0e56cc1dee7db3d52b16b5fdc53392b01435737136f974bb5b40e0"} Apr 16 18:04:26.296408 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:26.296372 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" event={"ID":"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff","Type":"ContainerStarted","Data":"db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4"} Apr 16 18:04:26.296408 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:26.296408 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" event={"ID":"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff","Type":"ContainerStarted","Data":"c829227308cffaac99c1ba91be858d74971d1e9cc13df198f918e19323ea84a1"} Apr 16 18:04:27.301232 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:27.301196 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" event={"ID":"1f60138e-b980-49ed-a234-2ad31edecde5","Type":"ContainerStarted","Data":"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0"} Apr 16 18:04:27.301648 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:27.301411 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:28.308379 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:28.308297 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" event={"ID":"1f60138e-b980-49ed-a234-2ad31edecde5","Type":"ContainerStarted","Data":"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f"} Apr 16 18:04:30.316977 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:30.316947 2574 
generic.go:358] "Generic (PLEG): container finished" podID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerID="db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4" exitCode=0 Apr 16 18:04:30.317366 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:30.317015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" event={"ID":"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff","Type":"ContainerDied","Data":"db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4"} Apr 16 18:04:31.321724 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:31.321689 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" event={"ID":"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff","Type":"ContainerStarted","Data":"253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee"} Apr 16 18:04:31.350080 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:31.350020 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podStartSLOduration=7.349998795 podStartE2EDuration="7.349998795s" podCreationTimestamp="2026-04-16 18:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:31.347851273 +0000 UTC m=+1481.507221554" watchObservedRunningTime="2026-04-16 18:04:31.349998795 +0000 UTC m=+1481.509369079" Apr 16 18:04:32.329879 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:32.329839 2574 generic.go:358] "Generic (PLEG): container finished" podID="1f60138e-b980-49ed-a234-2ad31edecde5" containerID="70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f" exitCode=0 Apr 16 18:04:32.329879 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:32.329878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" event={"ID":"1f60138e-b980-49ed-a234-2ad31edecde5","Type":"ContainerDied","Data":"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f"} Apr 16 18:04:33.336294 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:33.336248 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" event={"ID":"1f60138e-b980-49ed-a234-2ad31edecde5","Type":"ContainerStarted","Data":"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d"} Apr 16 18:04:33.363237 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:33.363172 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podStartSLOduration=8.452751556 podStartE2EDuration="9.363154605s" podCreationTimestamp="2026-04-16 18:04:24 +0000 UTC" firstStartedPulling="2026-04-16 18:04:25.334537194 +0000 UTC m=+1475.493907458" lastFinishedPulling="2026-04-16 18:04:26.244940244 +0000 UTC m=+1476.404310507" observedRunningTime="2026-04-16 18:04:33.360954731 +0000 UTC m=+1483.520325013" watchObservedRunningTime="2026-04-16 18:04:33.363154605 +0000 UTC m=+1483.522524888" Apr 16 18:04:35.166792 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:35.166736 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:35.167285 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:35.166814 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:35.168301 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:35.168272 2574 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 18:04:35.176656 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:35.176628 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:35.176812 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:35.176666 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:04:35.178009 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:35.177984 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 18:04:39.624593 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.624531 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"] Apr 16 18:04:39.656919 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.656875 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"] Apr 16 18:04:39.657086 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.657009 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.659464 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.659435 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 18:04:39.732805 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.732754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.732805 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.732800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8915a4-4e38-4d49-88d7-129343c09d6a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.733078 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.732909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.733078 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.732951 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.733078 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.733004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.733078 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.733044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs47b\" (UniqueName: \"kubernetes.io/projected/eb8915a4-4e38-4d49-88d7-129343c09d6a-kube-api-access-hs47b\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834322 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834322 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8915a4-4e38-4d49-88d7-129343c09d6a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834609 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834609 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834609 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834796 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs47b\" (UniqueName: 
\"kubernetes.io/projected/eb8915a4-4e38-4d49-88d7-129343c09d6a-kube-api-access-hs47b\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834853 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834825 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834908 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.834962 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.834935 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.836837 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.836808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.837373 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.837346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8915a4-4e38-4d49-88d7-129343c09d6a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.846475 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.846444 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs47b\" (UniqueName: \"kubernetes.io/projected/eb8915a4-4e38-4d49-88d7-129343c09d6a-kube-api-access-hs47b\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:39.968307 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:39.968263 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:04:40.127274 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:40.127244 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"] Apr 16 18:04:40.129118 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:04:40.129085 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb8915a4_4e38_4d49_88d7_129343c09d6a.slice/crio-0c8cbac48e06366ce6ed775f62a11ce97893d935d26936375828974c90bc77e1 WatchSource:0}: Error finding container 0c8cbac48e06366ce6ed775f62a11ce97893d935d26936375828974c90bc77e1: Status 404 returned error can't find the container with id 0c8cbac48e06366ce6ed775f62a11ce97893d935d26936375828974c90bc77e1 Apr 16 18:04:40.364915 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:40.364880 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" event={"ID":"eb8915a4-4e38-4d49-88d7-129343c09d6a","Type":"ContainerStarted","Data":"dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0"} Apr 16 18:04:40.364915 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:40.364924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" event={"ID":"eb8915a4-4e38-4d49-88d7-129343c09d6a","Type":"ContainerStarted","Data":"0c8cbac48e06366ce6ed775f62a11ce97893d935d26936375828974c90bc77e1"} Apr 16 18:04:45.167744 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:45.167689 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 16 18:04:45.177239 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:45.177202 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 16 18:04:45.362217 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:45.362184 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:04:45.388619 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:45.388555 2574 generic.go:358] "Generic (PLEG): container finished" podID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerID="dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0" exitCode=0 Apr 16 18:04:45.388820 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:45.388632 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" event={"ID":"eb8915a4-4e38-4d49-88d7-129343c09d6a","Type":"ContainerDied","Data":"dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0"} Apr 16 18:04:46.395126 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:46.395086 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" event={"ID":"eb8915a4-4e38-4d49-88d7-129343c09d6a","Type":"ContainerStarted","Data":"0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0"} Apr 16 18:04:46.419198 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:46.419137 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podStartSLOduration=7.419114047 podStartE2EDuration="7.419114047s" podCreationTimestamp="2026-04-16 18:04:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:46.417714332 +0000 UTC m=+1496.577084616" watchObservedRunningTime="2026-04-16 18:04:46.419114047 +0000 UTC m=+1496.578484330" Apr 16 18:04:48.734962 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.734925 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-577c8d7bbf-sjzck_4f0eb385-59d8-4ad1-a6ed-60d189133c04/main/0.log" Apr 16 18:04:48.735424 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.735405 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" Apr 16 18:04:48.818916 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.818872 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0eb385-59d8-4ad1-a6ed-60d189133c04-tls-certs\") pod \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " Apr 16 18:04:48.819121 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.818931 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-model-cache\") pod \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " Apr 16 18:04:48.819121 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.818961 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chc7p\" (UniqueName: \"kubernetes.io/projected/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kube-api-access-chc7p\") pod 
\"4f0eb385-59d8-4ad1-a6ed-60d189133c04\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " Apr 16 18:04:48.819121 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.819080 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kserve-provision-location\") pod \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " Apr 16 18:04:48.819121 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.819108 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-home\") pod \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " Apr 16 18:04:48.819330 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.819130 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-dshm\") pod \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\" (UID: \"4f0eb385-59d8-4ad1-a6ed-60d189133c04\") " Apr 16 18:04:48.819330 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.819182 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-model-cache" (OuterVolumeSpecName: "model-cache") pod "4f0eb385-59d8-4ad1-a6ed-60d189133c04" (UID: "4f0eb385-59d8-4ad1-a6ed-60d189133c04"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:48.819431 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.819387 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:04:48.819733 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.819710 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-home" (OuterVolumeSpecName: "home") pod "4f0eb385-59d8-4ad1-a6ed-60d189133c04" (UID: "4f0eb385-59d8-4ad1-a6ed-60d189133c04"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:48.821971 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.821917 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-dshm" (OuterVolumeSpecName: "dshm") pod "4f0eb385-59d8-4ad1-a6ed-60d189133c04" (UID: "4f0eb385-59d8-4ad1-a6ed-60d189133c04"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:48.822086 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.822024 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kube-api-access-chc7p" (OuterVolumeSpecName: "kube-api-access-chc7p") pod "4f0eb385-59d8-4ad1-a6ed-60d189133c04" (UID: "4f0eb385-59d8-4ad1-a6ed-60d189133c04"). InnerVolumeSpecName "kube-api-access-chc7p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:04:48.823734 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.823699 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f0eb385-59d8-4ad1-a6ed-60d189133c04-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4f0eb385-59d8-4ad1-a6ed-60d189133c04" (UID: "4f0eb385-59d8-4ad1-a6ed-60d189133c04"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:04:48.911512 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.911465 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f0eb385-59d8-4ad1-a6ed-60d189133c04" (UID: "4f0eb385-59d8-4ad1-a6ed-60d189133c04"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:04:48.920156 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.920121 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:04:48.920317 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.920162 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:04:48.920317 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.920178 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f0eb385-59d8-4ad1-a6ed-60d189133c04-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:04:48.920317 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.920191 2574 
reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f0eb385-59d8-4ad1-a6ed-60d189133c04-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:04:48.920317 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:48.920208 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-chc7p\" (UniqueName: \"kubernetes.io/projected/4f0eb385-59d8-4ad1-a6ed-60d189133c04-kube-api-access-chc7p\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:04:49.407629 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.407595 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-577c8d7bbf-sjzck_4f0eb385-59d8-4ad1-a6ed-60d189133c04/main/0.log"
Apr 16 18:04:49.408017 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.407989 2574 generic.go:358] "Generic (PLEG): container finished" podID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerID="fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe" exitCode=137
Apr 16 18:04:49.408132 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.408073 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" event={"ID":"4f0eb385-59d8-4ad1-a6ed-60d189133c04","Type":"ContainerDied","Data":"fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe"}
Apr 16 18:04:49.408132 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.408083 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck"
Apr 16 18:04:49.408132 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.408107 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck" event={"ID":"4f0eb385-59d8-4ad1-a6ed-60d189133c04","Type":"ContainerDied","Data":"1bfd96620ba1296bd284b1ea0ab784a677d2cf5bab997cafc6db3e377f46f26e"}
Apr 16 18:04:49.408132 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.408128 2574 scope.go:117] "RemoveContainer" containerID="fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe"
Apr 16 18:04:49.433164 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.433139 2574 scope.go:117] "RemoveContainer" containerID="29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e"
Apr 16 18:04:49.443638 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.443556 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck"]
Apr 16 18:04:49.447449 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.447417 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-577c8d7bbf-sjzck"]
Apr 16 18:04:49.512499 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.512466 2574 scope.go:117] "RemoveContainer" containerID="fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe"
Apr 16 18:04:49.513109 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:04:49.512950 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe\": container with ID starting with fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe not found: ID does not exist" containerID="fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe"
Apr 16 18:04:49.513109 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.512994 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe"} err="failed to get container status \"fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe\": rpc error: code = NotFound desc = could not find container \"fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe\": container with ID starting with fb0b9657cd7f3329b8aae39b87ab36da94330d336b3735bb9e8ca97c8c3765fe not found: ID does not exist"
Apr 16 18:04:49.513109 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.513024 2574 scope.go:117] "RemoveContainer" containerID="29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e"
Apr 16 18:04:49.514070 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:04:49.513978 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e\": container with ID starting with 29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e not found: ID does not exist" containerID="29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e"
Apr 16 18:04:49.514070 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.514013 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e"} err="failed to get container status \"29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e\": rpc error: code = NotFound desc = could not find container \"29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e\": container with ID starting with 29f012fb58b65ff98c21a1ae640506cf6d9a9808f761ca49e862c259bb93e16e not found: ID does not exist"
Apr 16 18:04:49.968655 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.968598 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"
Apr 16 18:04:49.968655 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.968659 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"
Apr 16 18:04:49.970719 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:49.970681 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:04:50.459937 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:50.459897 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" path="/var/lib/kubelet/pods/4f0eb385-59d8-4ad1-a6ed-60d189133c04/volumes"
Apr 16 18:04:50.461771 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:50.461743 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 18:04:50.462387 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:50.462366 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 18:04:50.478254 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:50.478225 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 18:04:50.478787 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:50.478765 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 18:04:55.167745 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:55.167688 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:04:55.177300 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:55.177250 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:04:59.969094 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:04:59.969044 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:05:05.167032 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:05.166975 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:05:05.176765 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:05.176722 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:05:09.969025 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:09.968913 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:05:15.167367 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:15.167311 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:05:15.177261 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:15.177219 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:05:19.969534 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:19.969482 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:05:25.167469 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:25.167417 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:05:25.177101 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:25.177047 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:05:29.969344 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:29.969297 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:05:35.167623 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:35.167552 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:05:35.176970 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:35.176924 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:05:39.968846 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:39.968796 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:05:45.167212 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:45.167165 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:05:45.176671 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:45.176632 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:05:49.969748 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:49.969678 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:05:55.166901 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:55.166852 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:05:55.178669 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:55.178616 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:05:59.969395 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:05:59.969347 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:06:05.167164 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:05.167108 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:06:05.178443 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:05.178409 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:06:09.969086 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:09.969032 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:06:15.167041 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:15.166992 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:06:15.177534 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:15.177492 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:06:19.968853 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:19.968813 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:06:25.167451 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:25.167405 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:06:25.176522 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:25.176492 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:06:29.969205 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:29.969156 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:06:35.167094 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:35.167044 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:06:35.176702 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:35.176663 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:06:39.969276 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:39.969184 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:06:45.167436 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:45.167387 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:06:45.177051 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:45.177013 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:06:49.969737 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:49.969686 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:06:55.167031 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:55.166975 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:06:55.176593 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:55.176534 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:06:59.969155 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:06:59.969102 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:07:05.167696 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:05.167647 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:07:05.176805 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:05.176769 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:07:09.969638 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:09.969590 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:07:15.167444 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:15.167393 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:07:15.176449 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:15.176401 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:07:19.968824 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:19.968775 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:07:25.167402 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:25.167335 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused"
Apr 16 18:07:25.176827 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:25.176783 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused"
Apr 16 18:07:29.969540 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:29.969490 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 16 18:07:35.177214 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:35.177184 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7"
Apr 16 18:07:35.186936 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:35.186904 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq"
Apr 16 18:07:35.189639 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:35.189616 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7"
Apr 16 18:07:35.195350 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:35.195327 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq"
Apr 16 18:07:39.979167 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:39.979128 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"
Apr 16 18:07:39.986915 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:39.986895 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"
Apr 16 18:07:48.417056 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:48.417015 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"]
Apr 16 18:07:48.417740 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:07:48.417432 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" containerID="cri-o://0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0" gracePeriod=30
Apr 16 18:08:01.861315 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.861279 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 18:08:01.861907 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.861806 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main"
Apr 16 18:08:01.861907 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.861826 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main"
Apr 16 18:08:01.861907 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.861848 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="storage-initializer"
Apr 16 18:08:01.861907 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.861858 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="storage-initializer"
Apr 16 18:08:01.862112 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.861937 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f0eb385-59d8-4ad1-a6ed-60d189133c04" containerName="main"
Apr 16 18:08:01.866533 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.866510 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:01.868721 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.868699 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-4mcrc\""
Apr 16 18:08:01.869112 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.869094 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 16 18:08:01.879681 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.879655 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 18:08:01.920707 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.920670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:01.920867 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.920720 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:01.920867 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.920820 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:01.920867 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.920848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65bccf38-702a-4382-8313-29b4cd2eb3f1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:01.920970 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.920871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvbvk\" (UniqueName: \"kubernetes.io/projected/65bccf38-702a-4382-8313-29b4cd2eb3f1-kube-api-access-mvbvk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:01.920970 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:01.920909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.021942 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.021905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.022133 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.021956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.022133 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.021989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.022133 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.022040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.022133 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.022063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65bccf38-702a-4382-8313-29b4cd2eb3f1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.022133 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.022087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvbvk\" (UniqueName: \"kubernetes.io/projected/65bccf38-702a-4382-8313-29b4cd2eb3f1-kube-api-access-mvbvk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.022394 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.022335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.022438 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.022386 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.022438 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.022397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.024522 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.024491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.024663 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.024629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65bccf38-702a-4382-8313-29b4cd2eb3f1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.031783 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.031763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvbvk\" (UniqueName: \"kubernetes.io/projected/65bccf38-702a-4382-8313-29b4cd2eb3f1-kube-api-access-mvbvk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:08:02.176969 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.176942 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:08:02.314696 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.314440 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 18:08:02.322327 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:08:02.322297 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65bccf38_702a_4382_8313_29b4cd2eb3f1.slice/crio-1b378ee5f1fd4c67bafe07da86a3fbfb069df207391da21252ece72c1d30ece6 WatchSource:0}: Error finding container 1b378ee5f1fd4c67bafe07da86a3fbfb069df207391da21252ece72c1d30ece6: Status 404 returned error can't find the container with id 1b378ee5f1fd4c67bafe07da86a3fbfb069df207391da21252ece72c1d30ece6 Apr 16 18:08:02.324379 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:02.324364 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:08:03.107441 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:03.107396 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"65bccf38-702a-4382-8313-29b4cd2eb3f1","Type":"ContainerStarted","Data":"01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6"} Apr 16 18:08:03.107441 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:03.107442 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"65bccf38-702a-4382-8313-29b4cd2eb3f1","Type":"ContainerStarted","Data":"1b378ee5f1fd4c67bafe07da86a3fbfb069df207391da21252ece72c1d30ece6"} Apr 16 18:08:04.174084 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:04.174050 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7"] Apr 16 18:08:04.174683 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:04.174609 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main" containerID="cri-o://0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d" gracePeriod=30 Apr 16 18:08:04.178998 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:04.178969 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq"] Apr 16 18:08:04.179338 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:04.179314 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main" containerID="cri-o://253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee" gracePeriod=30 Apr 16 18:08:07.122556 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:07.122525 2574 generic.go:358] "Generic (PLEG): container finished" podID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerID="01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6" exitCode=0 Apr 16 18:08:07.123027 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:07.122601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"65bccf38-702a-4382-8313-29b4cd2eb3f1","Type":"ContainerDied","Data":"01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6"} Apr 16 18:08:08.127881 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:08.127798 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"65bccf38-702a-4382-8313-29b4cd2eb3f1","Type":"ContainerStarted","Data":"8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2"} Apr 16 18:08:08.151610 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:08.151519 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.151495063 podStartE2EDuration="7.151495063s" podCreationTimestamp="2026-04-16 18:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:08:08.149148243 +0000 UTC m=+1698.308518528" watchObservedRunningTime="2026-04-16 18:08:08.151495063 +0000 UTC m=+1698.310865358" Apr 16 18:08:12.177179 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:12.177137 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:08:12.178548 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:12.178513 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:08:18.702835 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.702806 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs_eb8915a4-4e38-4d49-88d7-129343c09d6a/main/0.log" Apr 16 18:08:18.703239 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.703221 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:08:18.891854 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.891815 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs47b\" (UniqueName: \"kubernetes.io/projected/eb8915a4-4e38-4d49-88d7-129343c09d6a-kube-api-access-hs47b\") pod \"eb8915a4-4e38-4d49-88d7-129343c09d6a\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " Apr 16 18:08:18.892024 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.891879 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-model-cache\") pod \"eb8915a4-4e38-4d49-88d7-129343c09d6a\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " Apr 16 18:08:18.892024 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.891914 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8915a4-4e38-4d49-88d7-129343c09d6a-tls-certs\") pod \"eb8915a4-4e38-4d49-88d7-129343c09d6a\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " Apr 16 18:08:18.892024 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.891960 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-home\") pod \"eb8915a4-4e38-4d49-88d7-129343c09d6a\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " Apr 16 18:08:18.892024 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.891987 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-dshm\") pod \"eb8915a4-4e38-4d49-88d7-129343c09d6a\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " Apr 16 18:08:18.892299 ip-10-0-138-45 kubenswrapper[2574]: I0416 
18:08:18.892042 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-kserve-provision-location\") pod \"eb8915a4-4e38-4d49-88d7-129343c09d6a\" (UID: \"eb8915a4-4e38-4d49-88d7-129343c09d6a\") " Apr 16 18:08:18.892299 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.892160 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-model-cache" (OuterVolumeSpecName: "model-cache") pod "eb8915a4-4e38-4d49-88d7-129343c09d6a" (UID: "eb8915a4-4e38-4d49-88d7-129343c09d6a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:18.892395 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.892305 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:18.892549 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.892506 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-home" (OuterVolumeSpecName: "home") pod "eb8915a4-4e38-4d49-88d7-129343c09d6a" (UID: "eb8915a4-4e38-4d49-88d7-129343c09d6a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:18.894276 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.894246 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8915a4-4e38-4d49-88d7-129343c09d6a-kube-api-access-hs47b" (OuterVolumeSpecName: "kube-api-access-hs47b") pod "eb8915a4-4e38-4d49-88d7-129343c09d6a" (UID: "eb8915a4-4e38-4d49-88d7-129343c09d6a"). InnerVolumeSpecName "kube-api-access-hs47b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:08:18.894385 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.894291 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-dshm" (OuterVolumeSpecName: "dshm") pod "eb8915a4-4e38-4d49-88d7-129343c09d6a" (UID: "eb8915a4-4e38-4d49-88d7-129343c09d6a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:18.894503 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.894486 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb8915a4-4e38-4d49-88d7-129343c09d6a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "eb8915a4-4e38-4d49-88d7-129343c09d6a" (UID: "eb8915a4-4e38-4d49-88d7-129343c09d6a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:08:18.941160 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.941063 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eb8915a4-4e38-4d49-88d7-129343c09d6a" (UID: "eb8915a4-4e38-4d49-88d7-129343c09d6a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:18.992788 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.992752 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hs47b\" (UniqueName: \"kubernetes.io/projected/eb8915a4-4e38-4d49-88d7-129343c09d6a-kube-api-access-hs47b\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:18.992788 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.992781 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eb8915a4-4e38-4d49-88d7-129343c09d6a-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:18.992788 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.992792 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:18.993006 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.992800 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:18.993006 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:18.992810 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eb8915a4-4e38-4d49-88d7-129343c09d6a-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:19.167590 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.167552 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs_eb8915a4-4e38-4d49-88d7-129343c09d6a/main/0.log" Apr 16 18:08:19.167963 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.167936 2574 generic.go:358] "Generic (PLEG): container 
finished" podID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerID="0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0" exitCode=137 Apr 16 18:08:19.168036 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.168007 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" Apr 16 18:08:19.168036 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.168016 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" event={"ID":"eb8915a4-4e38-4d49-88d7-129343c09d6a","Type":"ContainerDied","Data":"0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0"} Apr 16 18:08:19.168108 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.168056 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs" event={"ID":"eb8915a4-4e38-4d49-88d7-129343c09d6a","Type":"ContainerDied","Data":"0c8cbac48e06366ce6ed775f62a11ce97893d935d26936375828974c90bc77e1"} Apr 16 18:08:19.168108 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.168073 2574 scope.go:117] "RemoveContainer" containerID="0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0" Apr 16 18:08:19.194212 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.194186 2574 scope.go:117] "RemoveContainer" containerID="dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0" Apr 16 18:08:19.197994 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.197969 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"] Apr 16 18:08:19.202144 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.202117 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-994df879-rhgvs"] Apr 16 
18:08:19.258607 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.258559 2574 scope.go:117] "RemoveContainer" containerID="0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0" Apr 16 18:08:19.259053 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:08:19.259025 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0\": container with ID starting with 0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0 not found: ID does not exist" containerID="0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0" Apr 16 18:08:19.259128 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.259070 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0"} err="failed to get container status \"0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0\": rpc error: code = NotFound desc = could not find container \"0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0\": container with ID starting with 0a1fd15cc0e4239564d15abc34bca357b489474b1411f5bf53cb0859d58d97a0 not found: ID does not exist" Apr 16 18:08:19.259128 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.259098 2574 scope.go:117] "RemoveContainer" containerID="dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0" Apr 16 18:08:19.259424 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:08:19.259401 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0\": container with ID starting with dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0 not found: ID does not exist" containerID="dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0" Apr 16 18:08:19.259533 
ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:19.259437 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0"} err="failed to get container status \"dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0\": rpc error: code = NotFound desc = could not find container \"dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0\": container with ID starting with dfc5c1372de7d99babdbad8e4f5122a8add95e259887a74dcfc6d059a899dbd0 not found: ID does not exist" Apr 16 18:08:20.301116 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.301079 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f"] Apr 16 18:08:20.301528 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.301437 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="storage-initializer" Apr 16 18:08:20.301528 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.301449 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="storage-initializer" Apr 16 18:08:20.301528 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.301458 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" Apr 16 18:08:20.301528 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.301463 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" Apr 16 18:08:20.301528 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.301521 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" containerName="main" Apr 16 18:08:20.333409 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.333377 2574 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f"] Apr 16 18:08:20.333618 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.333519 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.333970 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.333787 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d"] Apr 16 18:08:20.337483 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.337457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-cm2j2\"" Apr 16 18:08:20.337693 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.337674 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 18:08:20.348803 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.348778 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d"] Apr 16 18:08:20.348935 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.348920 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.456188 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.456156 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8915a4-4e38-4d49-88d7-129343c09d6a" path="/var/lib/kubelet/pods/eb8915a4-4e38-4d49-88d7-129343c09d6a/volumes" Apr 16 18:08:20.506050 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.505999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-home\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.506216 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-dshm\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.506216 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.506216 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506145 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/dead6bf3-c514-42e7-8d68-37056d1b826d-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.506216 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9db6872d-fd5b-4e35-82f1-af7192c3c460-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.506349 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.506349 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.506413 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.506413 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506370 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mn9\" (UniqueName: \"kubernetes.io/projected/dead6bf3-c514-42e7-8d68-37056d1b826d-kube-api-access-l8mn9\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.506476 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506417 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f88q\" (UniqueName: \"kubernetes.io/projected/9db6872d-fd5b-4e35-82f1-af7192c3c460-kube-api-access-5f88q\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.506476 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506440 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.506476 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.506460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-model-cache\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.607014 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.606929 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-dshm\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.607014 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.606967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.607014 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6bf3-c514-42e7-8d68-37056d1b826d-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.607292 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9db6872d-fd5b-4e35-82f1-af7192c3c460-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: 
\"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.607292 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.607292 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.607292 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.607292 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mn9\" (UniqueName: \"kubernetes.io/projected/dead6bf3-c514-42e7-8d68-37056d1b826d-kube-api-access-l8mn9\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.607292 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f88q\" (UniqueName: \"kubernetes.io/projected/9db6872d-fd5b-4e35-82f1-af7192c3c460-kube-api-access-5f88q\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.607292 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607287 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.607683 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-model-cache\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.607683 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607355 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-home\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.607683 ip-10-0-138-45 kubenswrapper[2574]: I0416 
18:08:20.607590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.608678 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.607730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-home\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.608678 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.608551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-model-cache\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.608894 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.608733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.609395 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.609156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.609395 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.609293 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.610796 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.610769 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9db6872d-fd5b-4e35-82f1-af7192c3c460-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.611551 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.611521 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-dshm\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.611772 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.611751 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.612150 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.612127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6bf3-c514-42e7-8d68-37056d1b826d-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.624120 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.624089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mn9\" (UniqueName: \"kubernetes.io/projected/dead6bf3-c514-42e7-8d68-37056d1b826d-kube-api-access-l8mn9\") pod \"custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.625439 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.625412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f88q\" (UniqueName: \"kubernetes.io/projected/9db6872d-fd5b-4e35-82f1-af7192c3c460-kube-api-access-5f88q\") pod \"custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.646171 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.646133 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:20.660090 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.660056 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:20.824854 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.824825 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f"] Apr 16 18:08:20.827506 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:08:20.827475 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9db6872d_fd5b_4e35_82f1_af7192c3c460.slice/crio-6b74aca7daa250691027896adde64989e58ae1289eb0e19f0550591b172f2cc0 WatchSource:0}: Error finding container 6b74aca7daa250691027896adde64989e58ae1289eb0e19f0550591b172f2cc0: Status 404 returned error can't find the container with id 6b74aca7daa250691027896adde64989e58ae1289eb0e19f0550591b172f2cc0 Apr 16 18:08:20.839444 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:20.839417 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d"] Apr 16 18:08:20.842831 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:08:20.842801 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddead6bf3_c514_42e7_8d68_37056d1b826d.slice/crio-bac757a237e725ca1ea1cfabc51dd9531b26e54ee5acf41bd92c7534da38529e WatchSource:0}: Error finding container bac757a237e725ca1ea1cfabc51dd9531b26e54ee5acf41bd92c7534da38529e: Status 404 returned error can't find the container with id bac757a237e725ca1ea1cfabc51dd9531b26e54ee5acf41bd92c7534da38529e Apr 16 18:08:21.179432 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:21.179393 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" 
event={"ID":"9db6872d-fd5b-4e35-82f1-af7192c3c460","Type":"ContainerStarted","Data":"7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c"} Apr 16 18:08:21.179984 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:21.179441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" event={"ID":"9db6872d-fd5b-4e35-82f1-af7192c3c460","Type":"ContainerStarted","Data":"6b74aca7daa250691027896adde64989e58ae1289eb0e19f0550591b172f2cc0"} Apr 16 18:08:21.179984 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:21.179565 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:21.181053 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:21.181028 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" event={"ID":"dead6bf3-c514-42e7-8d68-37056d1b826d","Type":"ContainerStarted","Data":"536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c"} Apr 16 18:08:21.181156 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:21.181060 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" event={"ID":"dead6bf3-c514-42e7-8d68-37056d1b826d","Type":"ContainerStarted","Data":"bac757a237e725ca1ea1cfabc51dd9531b26e54ee5acf41bd92c7534da38529e"} Apr 16 18:08:22.178308 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:22.178225 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:08:22.188029 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:22.187946 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" event={"ID":"9db6872d-fd5b-4e35-82f1-af7192c3c460","Type":"ContainerStarted","Data":"37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4"} Apr 16 18:08:25.203804 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:25.203769 2574 generic.go:358] "Generic (PLEG): container finished" podID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerID="536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c" exitCode=0 Apr 16 18:08:25.204208 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:25.203825 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" event={"ID":"dead6bf3-c514-42e7-8d68-37056d1b826d","Type":"ContainerDied","Data":"536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c"} Apr 16 18:08:26.210276 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:26.210231 2574 generic.go:358] "Generic (PLEG): container finished" podID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerID="37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4" exitCode=0 Apr 16 18:08:26.210756 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:26.210268 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" event={"ID":"9db6872d-fd5b-4e35-82f1-af7192c3c460","Type":"ContainerDied","Data":"37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4"} Apr 16 18:08:26.212985 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:26.212954 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" event={"ID":"dead6bf3-c514-42e7-8d68-37056d1b826d","Type":"ContainerStarted","Data":"939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899"} Apr 16 18:08:26.259946 ip-10-0-138-45 kubenswrapper[2574]: I0416 
18:08:26.259827 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podStartSLOduration=6.259807568 podStartE2EDuration="6.259807568s" podCreationTimestamp="2026-04-16 18:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:08:26.258872655 +0000 UTC m=+1716.418242949" watchObservedRunningTime="2026-04-16 18:08:26.259807568 +0000 UTC m=+1716.419177852" Apr 16 18:08:27.219130 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:27.219093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" event={"ID":"9db6872d-fd5b-4e35-82f1-af7192c3c460","Type":"ContainerStarted","Data":"d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c"} Apr 16 18:08:27.255564 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:27.255494 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podStartSLOduration=7.255471651 podStartE2EDuration="7.255471651s" podCreationTimestamp="2026-04-16 18:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:08:27.254694725 +0000 UTC m=+1717.414065009" watchObservedRunningTime="2026-04-16 18:08:27.255471651 +0000 UTC m=+1717.414841934" Apr 16 18:08:30.646702 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:30.646641 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:30.646702 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:30.646701 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:30.648160 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:30.648110 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:08:30.660184 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:30.660139 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:30.660184 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:30.660185 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:08:30.660843 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:30.660817 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:08:30.661614 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:30.661563 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:08:32.177801 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:32.177761 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 18:08:32.178218 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:32.178128 2574 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:08:34.174943 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.174857 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="llm-d-routing-sidecar" containerID="cri-o://736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0" gracePeriod=2 Apr 16 18:08:34.516091 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.516064 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7_1f60138e-b980-49ed-a234-2ad31edecde5/main/0.log" Apr 16 18:08:34.516955 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.516935 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:08:34.609491 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.609463 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:08:34.639610 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.639557 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-home\") pod \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " Apr 16 18:08:34.639800 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.639631 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-home\") pod \"1f60138e-b980-49ed-a234-2ad31edecde5\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " Apr 16 18:08:34.639800 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.639704 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-tls-certs\") pod \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " Apr 16 18:08:34.639800 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.639736 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-kserve-provision-location\") pod \"1f60138e-b980-49ed-a234-2ad31edecde5\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " Apr 16 18:08:34.639971 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.639847 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vq9l\" (UniqueName: \"kubernetes.io/projected/1f60138e-b980-49ed-a234-2ad31edecde5-kube-api-access-6vq9l\") pod \"1f60138e-b980-49ed-a234-2ad31edecde5\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " Apr 16 18:08:34.639971 ip-10-0-138-45 
kubenswrapper[2574]: I0416 18:08:34.639905 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60138e-b980-49ed-a234-2ad31edecde5-tls-certs\") pod \"1f60138e-b980-49ed-a234-2ad31edecde5\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " Apr 16 18:08:34.639971 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.639922 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-home" (OuterVolumeSpecName: "home") pod "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" (UID: "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:34.639971 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.639948 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kserve-provision-location\") pod \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " Apr 16 18:08:34.640169 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.640000 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-model-cache\") pod \"1f60138e-b980-49ed-a234-2ad31edecde5\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " Apr 16 18:08:34.640169 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.640030 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-dshm\") pod \"1f60138e-b980-49ed-a234-2ad31edecde5\" (UID: \"1f60138e-b980-49ed-a234-2ad31edecde5\") " Apr 16 18:08:34.640169 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.640067 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-model-cache\") pod \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " Apr 16 18:08:34.640169 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.640103 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-dshm\") pod \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " Apr 16 18:08:34.640169 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.640137 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmtxn\" (UniqueName: \"kubernetes.io/projected/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kube-api-access-gmtxn\") pod \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\" (UID: \"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff\") " Apr 16 18:08:34.640431 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.640394 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.640592 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.640527 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-home" (OuterVolumeSpecName: "home") pod "1f60138e-b980-49ed-a234-2ad31edecde5" (UID: "1f60138e-b980-49ed-a234-2ad31edecde5"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:34.641345 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.641319 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-model-cache" (OuterVolumeSpecName: "model-cache") pod "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" (UID: "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:34.644974 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.644936 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-dshm" (OuterVolumeSpecName: "dshm") pod "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" (UID: "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:34.645177 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.645136 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" (UID: "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:08:34.645278 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.645257 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-dshm" (OuterVolumeSpecName: "dshm") pod "1f60138e-b980-49ed-a234-2ad31edecde5" (UID: "1f60138e-b980-49ed-a234-2ad31edecde5"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:34.650869 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.650814 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-model-cache" (OuterVolumeSpecName: "model-cache") pod "1f60138e-b980-49ed-a234-2ad31edecde5" (UID: "1f60138e-b980-49ed-a234-2ad31edecde5"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:34.654502 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.654390 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f60138e-b980-49ed-a234-2ad31edecde5-kube-api-access-6vq9l" (OuterVolumeSpecName: "kube-api-access-6vq9l") pod "1f60138e-b980-49ed-a234-2ad31edecde5" (UID: "1f60138e-b980-49ed-a234-2ad31edecde5"). InnerVolumeSpecName "kube-api-access-6vq9l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:08:34.654635 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.654559 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kube-api-access-gmtxn" (OuterVolumeSpecName: "kube-api-access-gmtxn") pod "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" (UID: "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff"). InnerVolumeSpecName "kube-api-access-gmtxn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:08:34.654791 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.654765 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f60138e-b980-49ed-a234-2ad31edecde5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1f60138e-b980-49ed-a234-2ad31edecde5" (UID: "1f60138e-b980-49ed-a234-2ad31edecde5"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:08:34.714091 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.713962 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" (UID: "32b8df8f-1d64-437d-a608-8fc7a5c1f8ff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:34.741520 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741476 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741520 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741524 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gmtxn\" (UniqueName: \"kubernetes.io/projected/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kube-api-access-gmtxn\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741540 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741553 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741588 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6vq9l\" (UniqueName: 
\"kubernetes.io/projected/1f60138e-b980-49ed-a234-2ad31edecde5-kube-api-access-6vq9l\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741600 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f60138e-b980-49ed-a234-2ad31edecde5-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741615 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741629 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741641 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.741746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.741655 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:34.745831 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.745791 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"1f60138e-b980-49ed-a234-2ad31edecde5" (UID: "1f60138e-b980-49ed-a234-2ad31edecde5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:08:34.843206 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:34.843163 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f60138e-b980-49ed-a234-2ad31edecde5-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:08:35.264063 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.264029 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7_1f60138e-b980-49ed-a234-2ad31edecde5/main/0.log" Apr 16 18:08:35.264806 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.264777 2574 generic.go:358] "Generic (PLEG): container finished" podID="1f60138e-b980-49ed-a234-2ad31edecde5" containerID="0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d" exitCode=137 Apr 16 18:08:35.264806 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.264807 2574 generic.go:358] "Generic (PLEG): container finished" podID="1f60138e-b980-49ed-a234-2ad31edecde5" containerID="736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0" exitCode=0 Apr 16 18:08:35.264967 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.264907 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" event={"ID":"1f60138e-b980-49ed-a234-2ad31edecde5","Type":"ContainerDied","Data":"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d"} Apr 16 18:08:35.264967 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.264947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" 
event={"ID":"1f60138e-b980-49ed-a234-2ad31edecde5","Type":"ContainerDied","Data":"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0"} Apr 16 18:08:35.264967 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.264961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" event={"ID":"1f60138e-b980-49ed-a234-2ad31edecde5","Type":"ContainerDied","Data":"2727482bba0e56cc1dee7db3d52b16b5fdc53392b01435737136f974bb5b40e0"} Apr 16 18:08:35.265104 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.264966 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7" Apr 16 18:08:35.265104 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.264980 2574 scope.go:117] "RemoveContainer" containerID="0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d" Apr 16 18:08:35.266954 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.266924 2574 generic.go:358] "Generic (PLEG): container finished" podID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerID="253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee" exitCode=137 Apr 16 18:08:35.267066 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.267004 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" Apr 16 18:08:35.267066 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.267023 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" event={"ID":"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff","Type":"ContainerDied","Data":"253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee"} Apr 16 18:08:35.267066 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.267054 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq" event={"ID":"32b8df8f-1d64-437d-a608-8fc7a5c1f8ff","Type":"ContainerDied","Data":"c829227308cffaac99c1ba91be858d74971d1e9cc13df198f918e19323ea84a1"} Apr 16 18:08:35.289622 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.289602 2574 scope.go:117] "RemoveContainer" containerID="70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f" Apr 16 18:08:35.299235 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.297026 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7"] Apr 16 18:08:35.301876 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.301832 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-69c958dbdd8kvv7"] Apr 16 18:08:35.314324 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.314291 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq"] Apr 16 18:08:35.319491 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.319463 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-694dqdq"] Apr 16 18:08:35.364883 
ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.364855 2574 scope.go:117] "RemoveContainer" containerID="736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0" Apr 16 18:08:35.375661 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.375633 2574 scope.go:117] "RemoveContainer" containerID="0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d" Apr 16 18:08:35.377036 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:08:35.377004 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d\": container with ID starting with 0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d not found: ID does not exist" containerID="0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d" Apr 16 18:08:35.377155 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.377048 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d"} err="failed to get container status \"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d\": rpc error: code = NotFound desc = could not find container \"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d\": container with ID starting with 0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d not found: ID does not exist" Apr 16 18:08:35.377155 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.377077 2574 scope.go:117] "RemoveContainer" containerID="70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f" Apr 16 18:08:35.377765 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:08:35.377736 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f\": container with ID starting with 
70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f not found: ID does not exist" containerID="70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f" Apr 16 18:08:35.377879 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.377773 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f"} err="failed to get container status \"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f\": rpc error: code = NotFound desc = could not find container \"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f\": container with ID starting with 70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f not found: ID does not exist" Apr 16 18:08:35.377879 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.377799 2574 scope.go:117] "RemoveContainer" containerID="736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0" Apr 16 18:08:35.378263 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:08:35.378236 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0\": container with ID starting with 736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0 not found: ID does not exist" containerID="736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0" Apr 16 18:08:35.378372 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.378272 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0"} err="failed to get container status \"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0\": rpc error: code = NotFound desc = could not find container \"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0\": container with ID starting with 
736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0 not found: ID does not exist" Apr 16 18:08:35.378372 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.378294 2574 scope.go:117] "RemoveContainer" containerID="0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d" Apr 16 18:08:35.378603 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.378552 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d"} err="failed to get container status \"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d\": rpc error: code = NotFound desc = could not find container \"0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d\": container with ID starting with 0384b3d7e14a571e82aadced8cecd9716460630aa3f6995ab0829ef1ba6a132d not found: ID does not exist" Apr 16 18:08:35.378713 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.378637 2574 scope.go:117] "RemoveContainer" containerID="70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f" Apr 16 18:08:35.378885 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.378866 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f"} err="failed to get container status \"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f\": rpc error: code = NotFound desc = could not find container \"70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f\": container with ID starting with 70540704f041cf34db1afc3ec4a83cfb91a65e6745819786f0010ed429edb66f not found: ID does not exist" Apr 16 18:08:35.378885 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.378885 2574 scope.go:117] "RemoveContainer" containerID="736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0" Apr 16 18:08:35.379149 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.379131 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0"} err="failed to get container status \"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0\": rpc error: code = NotFound desc = could not find container \"736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0\": container with ID starting with 736ff6af064da9cc7b62dfe11cd1ccb8fce897f8021b14d8e52d3136ecb611d0 not found: ID does not exist" Apr 16 18:08:35.379149 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.379148 2574 scope.go:117] "RemoveContainer" containerID="253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee" Apr 16 18:08:35.405959 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.405936 2574 scope.go:117] "RemoveContainer" containerID="db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4" Apr 16 18:08:35.486807 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.486781 2574 scope.go:117] "RemoveContainer" containerID="253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee" Apr 16 18:08:35.487201 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:08:35.487170 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee\": container with ID starting with 253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee not found: ID does not exist" containerID="253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee" Apr 16 18:08:35.487340 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.487207 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee"} err="failed to get container status \"253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee\": rpc error: code = NotFound desc = could not 
find container \"253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee\": container with ID starting with 253bf90db35c1a22d1c3853c7f7de28123d7dc1827e97ad0bf8e5af3b43194ee not found: ID does not exist" Apr 16 18:08:35.487340 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.487230 2574 scope.go:117] "RemoveContainer" containerID="db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4" Apr 16 18:08:35.487717 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:08:35.487535 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4\": container with ID starting with db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4 not found: ID does not exist" containerID="db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4" Apr 16 18:08:35.487717 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:35.487593 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4"} err="failed to get container status \"db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4\": rpc error: code = NotFound desc = could not find container \"db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4\": container with ID starting with db7cd248dc334b24d56727a5c7130790b05b808418e974a217f97d46811787a4 not found: ID does not exist" Apr 16 18:08:36.455169 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:36.455135 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" path="/var/lib/kubelet/pods/1f60138e-b980-49ed-a234-2ad31edecde5/volumes" Apr 16 18:08:36.455861 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:36.455839 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" 
path="/var/lib/kubelet/pods/32b8df8f-1d64-437d-a608-8fc7a5c1f8ff/volumes" Apr 16 18:08:40.646950 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:40.646900 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:08:40.661175 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:40.661127 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:08:42.177991 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:42.177937 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:08:50.647047 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:50.646887 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:08:50.661406 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:50.661350 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:08:52.178302 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:08:52.178257 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:09:00.647076 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:00.647012 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:09:00.660552 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:00.660502 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:09:02.178095 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:02.178034 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:09:10.647462 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:10.647406 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" 
podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:09:10.660964 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:10.660915 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:09:12.178327 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:12.178268 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:09:20.647065 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:20.647004 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:09:20.661154 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:20.661105 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:09:22.177905 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:22.177860 2574 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:09:30.646852 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:30.646804 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:09:30.660546 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:30.660496 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:09:32.177539 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:32.177492 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:09:40.646819 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:40.646720 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:09:40.660859 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:40.660820 2574 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:09:42.177887 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:42.177846 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:09:50.496106 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:50.496079 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log" Apr 16 18:09:50.498227 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:50.498207 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log" Apr 16 18:09:50.501626 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:50.501606 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log" Apr 16 18:09:50.503469 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:50.503451 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log" Apr 16 18:09:50.646953 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:50.646907 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" 
podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:09:50.661307 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:50.661273 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:09:52.177997 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:09:52.177939 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:10:00.646966 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:00.646915 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:10:00.660775 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:00.660732 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:10:02.178148 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:02.178094 2574 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:10:10.647317 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:10.647251 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:10:10.660746 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:10.660706 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:10:12.178194 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:12.178140 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:10:20.646743 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:20.646692 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:10:20.660562 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:20.660519 2574 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:10:22.177907 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:22.177862 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:10:30.647462 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:30.647410 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 16 18:10:30.660473 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:30.660415 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 16 18:10:32.177724 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:32.177683 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 16 18:10:40.647626 
ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:40.647543 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused"
Apr 16 18:10:40.662050 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:40.662015 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 18:10:42.178104 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:42.178049 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 16 18:10:50.647012 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:50.646958 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused"
Apr 16 18:10:50.660671 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:50.660634 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 18:10:52.187473 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:52.187432 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:10:52.195236 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:10:52.195208 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:11:00.647216 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:00.647157 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused"
Apr 16 18:11:00.661374 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:00.661332 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 18:11:08.344866 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:08.344770 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 18:11:08.345329 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:08.345104 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main" containerID="cri-o://8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2" gracePeriod=30
Apr 16 18:11:09.786102 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.786074 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:11:09.829997 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.829908 2574 generic.go:358] "Generic (PLEG): container finished" podID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerID="8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2" exitCode=0
Apr 16 18:11:09.830146 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.829996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"65bccf38-702a-4382-8313-29b4cd2eb3f1","Type":"ContainerDied","Data":"8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2"}
Apr 16 18:11:09.830146 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.830005 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 18:11:09.830146 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.830046 2574 scope.go:117] "RemoveContainer" containerID="8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2"
Apr 16 18:11:09.830146 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.830035 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"65bccf38-702a-4382-8313-29b4cd2eb3f1","Type":"ContainerDied","Data":"1b378ee5f1fd4c67bafe07da86a3fbfb069df207391da21252ece72c1d30ece6"}
Apr 16 18:11:09.854646 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.854623 2574 scope.go:117] "RemoveContainer" containerID="01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6"
Apr 16 18:11:09.916788 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.916766 2574 scope.go:117] "RemoveContainer" containerID="8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2"
Apr 16 18:11:09.917139 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:11:09.917118 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2\": container with ID starting with 8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2 not found: ID does not exist" containerID="8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2"
Apr 16 18:11:09.917193 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.917153 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2"} err="failed to get container status \"8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2\": rpc error: code = NotFound desc = could not find container \"8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2\": container with ID starting with 8a2f6dcb5bb8efec67d854c14796de4895fc5bad5d5ffcf83075e899293fb7e2 not found: ID does not exist"
Apr 16 18:11:09.917193 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.917171 2574 scope.go:117] "RemoveContainer" containerID="01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6"
Apr 16 18:11:09.917493 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:11:09.917471 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6\": container with ID starting with 01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6 not found: ID does not exist" containerID="01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6"
Apr 16 18:11:09.917566 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.917505 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6"} err="failed to get container status \"01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6\": rpc error: code = NotFound desc = could not find container \"01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6\": container with ID starting with 01327005f5477b9660498f65d4f741c04cdff589fda8ff745158e334c5d6b4f6 not found: ID does not exist"
Apr 16 18:11:09.921788 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.921772 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-kserve-provision-location\") pod \"65bccf38-702a-4382-8313-29b4cd2eb3f1\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") "
Apr 16 18:11:09.921839 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.921826 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvbvk\" (UniqueName: \"kubernetes.io/projected/65bccf38-702a-4382-8313-29b4cd2eb3f1-kube-api-access-mvbvk\") pod \"65bccf38-702a-4382-8313-29b4cd2eb3f1\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") "
Apr 16 18:11:09.921874 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.921857 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-dshm\") pod \"65bccf38-702a-4382-8313-29b4cd2eb3f1\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") "
Apr 16 18:11:09.921908 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.921878 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-home\") pod \"65bccf38-702a-4382-8313-29b4cd2eb3f1\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") "
Apr 16 18:11:09.921946 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.921918 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65bccf38-702a-4382-8313-29b4cd2eb3f1-tls-certs\") pod \"65bccf38-702a-4382-8313-29b4cd2eb3f1\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") "
Apr 16 18:11:09.921946 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.921944 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-model-cache\") pod \"65bccf38-702a-4382-8313-29b4cd2eb3f1\" (UID: \"65bccf38-702a-4382-8313-29b4cd2eb3f1\") "
Apr 16 18:11:09.922351 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.922316 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-model-cache" (OuterVolumeSpecName: "model-cache") pod "65bccf38-702a-4382-8313-29b4cd2eb3f1" (UID: "65bccf38-702a-4382-8313-29b4cd2eb3f1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:09.922351 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.922333 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-home" (OuterVolumeSpecName: "home") pod "65bccf38-702a-4382-8313-29b4cd2eb3f1" (UID: "65bccf38-702a-4382-8313-29b4cd2eb3f1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:09.924305 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.924269 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65bccf38-702a-4382-8313-29b4cd2eb3f1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "65bccf38-702a-4382-8313-29b4cd2eb3f1" (UID: "65bccf38-702a-4382-8313-29b4cd2eb3f1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:11:09.924413 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.924359 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bccf38-702a-4382-8313-29b4cd2eb3f1-kube-api-access-mvbvk" (OuterVolumeSpecName: "kube-api-access-mvbvk") pod "65bccf38-702a-4382-8313-29b4cd2eb3f1" (UID: "65bccf38-702a-4382-8313-29b4cd2eb3f1"). InnerVolumeSpecName "kube-api-access-mvbvk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:11:09.924413 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.924404 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-dshm" (OuterVolumeSpecName: "dshm") pod "65bccf38-702a-4382-8313-29b4cd2eb3f1" (UID: "65bccf38-702a-4382-8313-29b4cd2eb3f1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:09.979684 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:09.979644 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "65bccf38-702a-4382-8313-29b4cd2eb3f1" (UID: "65bccf38-702a-4382-8313-29b4cd2eb3f1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:11:10.023013 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.022978 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:11:10.023013 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.023006 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvbvk\" (UniqueName: \"kubernetes.io/projected/65bccf38-702a-4382-8313-29b4cd2eb3f1-kube-api-access-mvbvk\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:11:10.023013 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.023019 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:11:10.023228 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.023029 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:11:10.023228 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.023038 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65bccf38-702a-4382-8313-29b4cd2eb3f1-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:11:10.023228 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.023046 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65bccf38-702a-4382-8313-29b4cd2eb3f1-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\""
Apr 16 18:11:10.155207 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.155174 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 18:11:10.162320 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.162290 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 18:11:10.455322 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.455281 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" path="/var/lib/kubelet/pods/65bccf38-702a-4382-8313-29b4cd2eb3f1/volumes"
Apr 16 18:11:10.646919 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.646867 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused"
Apr 16 18:11:10.661159 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:10.661109 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 16 18:11:20.656699 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:20.656666 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f"
Apr 16 18:11:20.674774 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:20.674743 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f"
Apr 16 18:11:20.676764 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:20.676747 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d"
Apr 16 18:11:20.684131 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:20.684112 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d"
Apr 16 18:11:35.938558 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:35.938524 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f"]
Apr 16 18:11:35.939085 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:35.938907 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" containerID="cri-o://d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c" gracePeriod=30
Apr 16 18:11:35.942933 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:35.942911 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d"]
Apr 16 18:11:35.943253 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:35.943197 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" containerID="cri-o://939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899" gracePeriod=30
Apr 16 18:11:44.682862 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.682825 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"]
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683118 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="storage-initializer"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683129 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="storage-initializer"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683144 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683150 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683157 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683163 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683173 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683178 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683184 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="storage-initializer"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683190 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="storage-initializer"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683199 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="storage-initializer"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683203 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="storage-initializer"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683209 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="llm-d-routing-sidecar"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683213 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="llm-d-routing-sidecar"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683278 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="65bccf38-702a-4382-8313-29b4cd2eb3f1" containerName="main"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683289 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="main"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683295 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f60138e-b980-49ed-a234-2ad31edecde5" containerName="llm-d-routing-sidecar"
Apr 16 18:11:44.683319 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.683301 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="32b8df8f-1d64-437d-a608-8fc7a5c1f8ff" containerName="main"
Apr 16 18:11:44.686197 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.686180 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.688184 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.688162 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-rb695\""
Apr 16 18:11:44.688388 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.688371 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 16 18:11:44.704780 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.704753 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"]
Apr 16 18:11:44.714839 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.714810 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svqbx\" (UniqueName: \"kubernetes.io/projected/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kube-api-access-svqbx\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.714971 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.714863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-model-cache\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.714971 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.714918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-dshm\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.715094 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.714992 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-home\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.715094 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.715019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-tls-certs\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.715094 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.715051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.815853 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.815818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.816046 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.815875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svqbx\" (UniqueName: \"kubernetes.io/projected/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kube-api-access-svqbx\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.816046 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.815898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-model-cache\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.816046 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.815930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-dshm\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.816046 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.815960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-home\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.816046 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.815977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-tls-certs\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.816305 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.816262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.816305 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.816277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-model-cache\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.816430 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.816362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-home\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.818371 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.818350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-dshm\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.818650 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.818634 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-tls-certs\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.825526 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.825503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svqbx\" (UniqueName: \"kubernetes.io/projected/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kube-api-access-svqbx\") pod \"router-with-refs-pd-test-kserve-58bd99f4df-qmfrp\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:44.996026 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:44.995935 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:45.129156 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:45.129126 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"]
Apr 16 18:11:45.131615 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:11:45.131566 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4851a74d_dfd8_4bd5_a5f3_567e9fe4f664.slice/crio-35bc83b27056abfaaf8f25d3e7a7a81a92514128cee48c113ed6f8b535b8dee7 WatchSource:0}: Error finding container 35bc83b27056abfaaf8f25d3e7a7a81a92514128cee48c113ed6f8b535b8dee7: Status 404 returned error can't find the container with id 35bc83b27056abfaaf8f25d3e7a7a81a92514128cee48c113ed6f8b535b8dee7
Apr 16 18:11:45.974550 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:45.974517 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" event={"ID":"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664","Type":"ContainerStarted","Data":"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a"}
Apr 16 18:11:45.974550 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:45.974554 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" event={"ID":"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664","Type":"ContainerStarted","Data":"35bc83b27056abfaaf8f25d3e7a7a81a92514128cee48c113ed6f8b535b8dee7"}
Apr 16 18:11:46.980143 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:46.980075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" event={"ID":"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664","Type":"ContainerStarted","Data":"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860"}
Apr 16 18:11:46.980713 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:46.980232 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:49.997428 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:49.997394 2574 generic.go:358] "Generic (PLEG): container finished" podID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerID="0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860" exitCode=0
Apr 16 18:11:49.997841 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:49.997473 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" event={"ID":"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664","Type":"ContainerDied","Data":"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860"}
Apr 16 18:11:51.003304 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:51.003264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" event={"ID":"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664","Type":"ContainerStarted","Data":"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1"}
Apr 16 18:11:51.051498 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:51.051439 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podStartSLOduration=7.051417562 podStartE2EDuration="7.051417562s" podCreationTimestamp="2026-04-16 18:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:51.047795258 +0000 UTC m=+1921.207165539" watchObservedRunningTime="2026-04-16 18:11:51.051417562 +0000 UTC m=+1921.210787846"
Apr 16 18:11:54.996918 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:54.996880 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:54.996918 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:54.996923 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:11:54.998210 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:11:54.998176 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused"
Apr 16 18:12:04.996381 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:04.996323 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused"
Apr 16 18:12:05.016566 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:05.016534 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"
Apr 16 18:12:05.939376 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:05.939326 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="llm-d-routing-sidecar" containerID="cri-o://7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c" gracePeriod=2
Apr 16 18:12:06.056992 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.056954 2574 generic.go:358] "Generic (PLEG): container finished" podID="9db6872d-fd5b-4e35-82f1-af7192c3c460"
containerID="7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c" exitCode=0 Apr 16 18:12:06.057597 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.057060 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" event={"ID":"9db6872d-fd5b-4e35-82f1-af7192c3c460","Type":"ContainerDied","Data":"7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c"} Apr 16 18:12:06.227628 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.227601 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:12:06.252726 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.252705 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f_9db6872d-fd5b-4e35-82f1-af7192c3c460/main/0.log" Apr 16 18:12:06.253374 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.253356 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:12:06.300863 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.300834 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-home\") pod \"9db6872d-fd5b-4e35-82f1-af7192c3c460\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " Apr 16 18:12:06.301017 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.300890 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6bf3-c514-42e7-8d68-37056d1b826d-tls-certs\") pod \"dead6bf3-c514-42e7-8d68-37056d1b826d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " Apr 16 18:12:06.301017 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.300928 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-kserve-provision-location\") pod \"9db6872d-fd5b-4e35-82f1-af7192c3c460\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " Apr 16 18:12:06.301017 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.300961 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-dshm\") pod \"9db6872d-fd5b-4e35-82f1-af7192c3c460\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " Apr 16 18:12:06.301017 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.300984 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8mn9\" (UniqueName: \"kubernetes.io/projected/dead6bf3-c514-42e7-8d68-37056d1b826d-kube-api-access-l8mn9\") pod \"dead6bf3-c514-42e7-8d68-37056d1b826d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " Apr 16 18:12:06.301293 ip-10-0-138-45 
kubenswrapper[2574]: I0416 18:12:06.301030 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9db6872d-fd5b-4e35-82f1-af7192c3c460-tls-certs\") pod \"9db6872d-fd5b-4e35-82f1-af7192c3c460\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " Apr 16 18:12:06.301293 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301055 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f88q\" (UniqueName: \"kubernetes.io/projected/9db6872d-fd5b-4e35-82f1-af7192c3c460-kube-api-access-5f88q\") pod \"9db6872d-fd5b-4e35-82f1-af7192c3c460\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " Apr 16 18:12:06.301293 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301096 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-model-cache\") pod \"9db6872d-fd5b-4e35-82f1-af7192c3c460\" (UID: \"9db6872d-fd5b-4e35-82f1-af7192c3c460\") " Apr 16 18:12:06.301293 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301124 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-home\") pod \"dead6bf3-c514-42e7-8d68-37056d1b826d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " Apr 16 18:12:06.301293 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301184 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-kserve-provision-location\") pod \"dead6bf3-c514-42e7-8d68-37056d1b826d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " Apr 16 18:12:06.301293 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301208 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-dshm\") pod \"dead6bf3-c514-42e7-8d68-37056d1b826d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " Apr 16 18:12:06.301293 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301249 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-model-cache\") pod \"dead6bf3-c514-42e7-8d68-37056d1b826d\" (UID: \"dead6bf3-c514-42e7-8d68-37056d1b826d\") " Apr 16 18:12:06.301293 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301251 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-home" (OuterVolumeSpecName: "home") pod "9db6872d-fd5b-4e35-82f1-af7192c3c460" (UID: "9db6872d-fd5b-4e35-82f1-af7192c3c460"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:06.301693 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301473 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.301756 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301715 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-model-cache" (OuterVolumeSpecName: "model-cache") pod "dead6bf3-c514-42e7-8d68-37056d1b826d" (UID: "dead6bf3-c514-42e7-8d68-37056d1b826d"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:06.301814 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.301797 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-home" (OuterVolumeSpecName: "home") pod "dead6bf3-c514-42e7-8d68-37056d1b826d" (UID: "dead6bf3-c514-42e7-8d68-37056d1b826d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:06.302276 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.302141 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-model-cache" (OuterVolumeSpecName: "model-cache") pod "9db6872d-fd5b-4e35-82f1-af7192c3c460" (UID: "9db6872d-fd5b-4e35-82f1-af7192c3c460"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:06.304104 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.304075 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-dshm" (OuterVolumeSpecName: "dshm") pod "9db6872d-fd5b-4e35-82f1-af7192c3c460" (UID: "9db6872d-fd5b-4e35-82f1-af7192c3c460"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:06.304273 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.304234 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db6872d-fd5b-4e35-82f1-af7192c3c460-kube-api-access-5f88q" (OuterVolumeSpecName: "kube-api-access-5f88q") pod "9db6872d-fd5b-4e35-82f1-af7192c3c460" (UID: "9db6872d-fd5b-4e35-82f1-af7192c3c460"). InnerVolumeSpecName "kube-api-access-5f88q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:06.304710 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.304678 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-dshm" (OuterVolumeSpecName: "dshm") pod "dead6bf3-c514-42e7-8d68-37056d1b826d" (UID: "dead6bf3-c514-42e7-8d68-37056d1b826d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:06.305681 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.305654 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dead6bf3-c514-42e7-8d68-37056d1b826d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dead6bf3-c514-42e7-8d68-37056d1b826d" (UID: "dead6bf3-c514-42e7-8d68-37056d1b826d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:06.305791 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.305697 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dead6bf3-c514-42e7-8d68-37056d1b826d-kube-api-access-l8mn9" (OuterVolumeSpecName: "kube-api-access-l8mn9") pod "dead6bf3-c514-42e7-8d68-37056d1b826d" (UID: "dead6bf3-c514-42e7-8d68-37056d1b826d"). InnerVolumeSpecName "kube-api-access-l8mn9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:12:06.306385 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.306361 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db6872d-fd5b-4e35-82f1-af7192c3c460-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9db6872d-fd5b-4e35-82f1-af7192c3c460" (UID: "9db6872d-fd5b-4e35-82f1-af7192c3c460"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:12:06.364006 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.363957 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9db6872d-fd5b-4e35-82f1-af7192c3c460" (UID: "9db6872d-fd5b-4e35-82f1-af7192c3c460"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:06.367404 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.367377 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dead6bf3-c514-42e7-8d68-37056d1b826d" (UID: "dead6bf3-c514-42e7-8d68-37056d1b826d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:12:06.402906 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402871 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dead6bf3-c514-42e7-8d68-37056d1b826d-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.402906 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402905 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402914 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 
kubenswrapper[2574]: I0416 18:12:06.402925 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l8mn9\" (UniqueName: \"kubernetes.io/projected/dead6bf3-c514-42e7-8d68-37056d1b826d-kube-api-access-l8mn9\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402934 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9db6872d-fd5b-4e35-82f1-af7192c3c460-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402945 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5f88q\" (UniqueName: \"kubernetes.io/projected/9db6872d-fd5b-4e35-82f1-af7192c3c460-kube-api-access-5f88q\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402954 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9db6872d-fd5b-4e35-82f1-af7192c3c460-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402962 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402970 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402979 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:06.403083 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:06.402987 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dead6bf3-c514-42e7-8d68-37056d1b826d-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:12:07.061930 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.061903 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f_9db6872d-fd5b-4e35-82f1-af7192c3c460/main/0.log" Apr 16 18:12:07.062540 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.062517 2574 generic.go:358] "Generic (PLEG): container finished" podID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerID="d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c" exitCode=137 Apr 16 18:12:07.062639 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.062608 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" event={"ID":"9db6872d-fd5b-4e35-82f1-af7192c3c460","Type":"ContainerDied","Data":"d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c"} Apr 16 18:12:07.062639 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.062632 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" Apr 16 18:12:07.062753 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.062643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f" event={"ID":"9db6872d-fd5b-4e35-82f1-af7192c3c460","Type":"ContainerDied","Data":"6b74aca7daa250691027896adde64989e58ae1289eb0e19f0550591b172f2cc0"} Apr 16 18:12:07.062753 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.062662 2574 scope.go:117] "RemoveContainer" containerID="d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c" Apr 16 18:12:07.064288 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.064267 2574 generic.go:358] "Generic (PLEG): container finished" podID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerID="939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899" exitCode=137 Apr 16 18:12:07.064391 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.064303 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" event={"ID":"dead6bf3-c514-42e7-8d68-37056d1b826d","Type":"ContainerDied","Data":"939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899"} Apr 16 18:12:07.064391 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.064342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" event={"ID":"dead6bf3-c514-42e7-8d68-37056d1b826d","Type":"ContainerDied","Data":"bac757a237e725ca1ea1cfabc51dd9531b26e54ee5acf41bd92c7534da38529e"} Apr 16 18:12:07.064391 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.064365 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d" Apr 16 18:12:07.085657 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.085630 2574 scope.go:117] "RemoveContainer" containerID="37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4" Apr 16 18:12:07.088740 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.088716 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d"] Apr 16 18:12:07.093992 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.093969 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-cddd6858-5mv9d"] Apr 16 18:12:07.108003 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.107975 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f"] Apr 16 18:12:07.114322 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.114291 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-78ddd989bf-8xb8f"] Apr 16 18:12:07.150536 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.150512 2574 scope.go:117] "RemoveContainer" containerID="7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c" Apr 16 18:12:07.158405 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.158381 2574 scope.go:117] "RemoveContainer" containerID="d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c" Apr 16 18:12:07.158731 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:12:07.158704 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c\": container with ID starting with d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c not found: ID does not exist" 
containerID="d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c" Apr 16 18:12:07.158808 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.158742 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c"} err="failed to get container status \"d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c\": rpc error: code = NotFound desc = could not find container \"d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c\": container with ID starting with d17fac4415bab972bf1e7671eea9516d6021e19e5ed6320a7240c7cbec72af6c not found: ID does not exist" Apr 16 18:12:07.158808 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.158763 2574 scope.go:117] "RemoveContainer" containerID="37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4" Apr 16 18:12:07.159023 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:12:07.159008 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4\": container with ID starting with 37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4 not found: ID does not exist" containerID="37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4" Apr 16 18:12:07.159084 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.159027 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4"} err="failed to get container status \"37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4\": rpc error: code = NotFound desc = could not find container \"37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4\": container with ID starting with 37a7a4afe76fb9480846668faaef4c3c7f10c09849e691e602ed7272cecb96c4 not found: ID does not exist" Apr 16 
18:12:07.159084 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.159043 2574 scope.go:117] "RemoveContainer" containerID="7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c" Apr 16 18:12:07.159285 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:12:07.159265 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c\": container with ID starting with 7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c not found: ID does not exist" containerID="7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c" Apr 16 18:12:07.159345 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.159294 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c"} err="failed to get container status \"7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c\": rpc error: code = NotFound desc = could not find container \"7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c\": container with ID starting with 7e65f566b37356e6014f422e42243652d3c642a7ae82e349f69dd52af24a045c not found: ID does not exist" Apr 16 18:12:07.159345 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.159319 2574 scope.go:117] "RemoveContainer" containerID="939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899" Apr 16 18:12:07.178609 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.178556 2574 scope.go:117] "RemoveContainer" containerID="536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c" Apr 16 18:12:07.243417 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.243387 2574 scope.go:117] "RemoveContainer" containerID="939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899" Apr 16 18:12:07.243784 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:12:07.243753 2574 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899\": container with ID starting with 939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899 not found: ID does not exist" containerID="939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899" Apr 16 18:12:07.243876 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.243797 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899"} err="failed to get container status \"939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899\": rpc error: code = NotFound desc = could not find container \"939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899\": container with ID starting with 939aca6fa4c889b8b65dfb6a2ddd808fd2d02e6f2774ecfc556187a751fdd899 not found: ID does not exist" Apr 16 18:12:07.243876 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.243820 2574 scope.go:117] "RemoveContainer" containerID="536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c" Apr 16 18:12:07.244161 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:12:07.244141 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c\": container with ID starting with 536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c not found: ID does not exist" containerID="536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c" Apr 16 18:12:07.244219 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:07.244168 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c"} err="failed to get container status 
\"536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c\": rpc error: code = NotFound desc = could not find container \"536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c\": container with ID starting with 536c79a08b2fefaf9775f956d619b954136521a9c2336d233f20d8ea91a1657c not found: ID does not exist" Apr 16 18:12:08.454942 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:08.454909 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" path="/var/lib/kubelet/pods/9db6872d-fd5b-4e35-82f1-af7192c3c460/volumes" Apr 16 18:12:08.455353 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:08.455340 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" path="/var/lib/kubelet/pods/dead6bf3-c514-42e7-8d68-37056d1b826d/volumes" Apr 16 18:12:14.996681 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:14.996615 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 16 18:12:24.996691 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:24.996642 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 16 18:12:34.997289 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:34.997225 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 16 18:12:44.997285 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:44.997172 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 16 18:12:54.997120 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:12:54.997055 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 16 18:13:04.997037 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:04.996983 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 16 18:13:14.996695 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:14.996640 2574 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 16 18:13:25.012526 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:25.012491 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" Apr 16 18:13:25.025391 ip-10-0-138-45 kubenswrapper[2574]: 
I0416 18:13:25.025368 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" Apr 16 18:13:37.152668 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:37.152633 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"] Apr 16 18:13:37.153210 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:37.152990 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main" containerID="cri-o://541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1" gracePeriod=30 Apr 16 18:13:53.925049 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:53.925016 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:13:53.941281 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:53.941259 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:13:53.953773 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:53.953746 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:13:55.070771 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:55.070724 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:13:55.087454 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:55.087421 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:13:55.106139 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:55.106097 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:13:56.197070 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:56.197040 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:13:56.206486 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:56.206463 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:13:56.218505 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:56.218485 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:13:57.271949 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:57.271894 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:13:57.280335 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:57.280309 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:13:57.291936 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:57.291909 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:13:58.341105 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:58.341068 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:13:58.357595 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:58.357547 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:13:58.369436 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:58.369408 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:13:59.487589 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:59.487539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:13:59.497740 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:59.497719 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:13:59.514522 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:13:59.514499 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:14:00.624215 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:00.624182 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:14:00.639303 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:00.639275 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:14:00.654588 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:00.654536 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:14:01.756736 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:01.756706 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:14:01.765350 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:01.765323 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:14:01.778274 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:01.778247 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:14:02.891737 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:02.891705 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:14:02.902985 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:02.902960 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:14:02.920767 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:02.920737 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:14:04.136223 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:04.136194 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:14:04.152728 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:04.152690 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:14:04.167403 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:04.167366 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:14:05.311483 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:05.311449 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:14:05.323174 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:05.323135 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:14:05.339934 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:05.339910 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:14:06.426307 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:06.426265 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:14:06.435841 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:06.435815 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/llm-d-routing-sidecar/0.log" Apr 16 18:14:06.450419 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:06.450384 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/storage-initializer/0.log" Apr 16 18:14:07.153187 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.153128 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="llm-d-routing-sidecar" containerID="cri-o://80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a" gracePeriod=2 Apr 16 18:14:07.394823 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.394800 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:14:07.395479 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.395447 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" Apr 16 18:14:07.453277 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.453252 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-58bd99f4df-qmfrp_4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/main/0.log" Apr 16 18:14:07.453950 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.453925 2574 generic.go:358] "Generic (PLEG): container finished" podID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerID="541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1" exitCode=137 Apr 16 18:14:07.454038 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.453953 2574 generic.go:358] "Generic (PLEG): container finished" podID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerID="80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a" exitCode=0 Apr 16 18:14:07.454038 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.454009 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" Apr 16 18:14:07.454038 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.454010 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" event={"ID":"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664","Type":"ContainerDied","Data":"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1"} Apr 16 18:14:07.454137 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.454045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" event={"ID":"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664","Type":"ContainerDied","Data":"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a"} Apr 16 18:14:07.454137 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.454057 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp" event={"ID":"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664","Type":"ContainerDied","Data":"35bc83b27056abfaaf8f25d3e7a7a81a92514128cee48c113ed6f8b535b8dee7"} Apr 16 18:14:07.454137 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.454072 2574 scope.go:117] "RemoveContainer" containerID="541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1" Apr 16 18:14:07.474290 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.474269 2574 scope.go:117] "RemoveContainer" containerID="0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860" Apr 16 18:14:07.489388 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.489355 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svqbx\" (UniqueName: \"kubernetes.io/projected/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kube-api-access-svqbx\") pod \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " Apr 16 18:14:07.489520 ip-10-0-138-45 
kubenswrapper[2574]: I0416 18:14:07.489443 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-tls-certs\") pod \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " Apr 16 18:14:07.489520 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.489477 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kserve-provision-location\") pod \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " Apr 16 18:14:07.489667 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.489526 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-model-cache\") pod \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " Apr 16 18:14:07.489667 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.489564 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-dshm\") pod \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " Apr 16 18:14:07.489667 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.489618 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-home\") pod \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\" (UID: \"4851a74d-dfd8-4bd5-a5f3-567e9fe4f664\") " Apr 16 18:14:07.489860 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.489836 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-model-cache" (OuterVolumeSpecName: "model-cache") pod "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" (UID: "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:14:07.490150 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.490114 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-home" (OuterVolumeSpecName: "home") pod "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" (UID: "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:14:07.491816 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.491781 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" (UID: "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:14:07.491816 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.491796 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kube-api-access-svqbx" (OuterVolumeSpecName: "kube-api-access-svqbx") pod "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" (UID: "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664"). InnerVolumeSpecName "kube-api-access-svqbx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:14:07.492173 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.492151 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-dshm" (OuterVolumeSpecName: "dshm") pod "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" (UID: "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:14:07.557022 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.556998 2574 scope.go:117] "RemoveContainer" containerID="80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a" Apr 16 18:14:07.562804 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:14:07.562773 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1\": container with ID starting with 541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1 not found: ID does not exist" containerID="541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1" Apr 16 18:14:07.564703 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.564674 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" (UID: "4851a74d-dfd8-4bd5-a5f3-567e9fe4f664"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:14:07.565206 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.565191 2574 scope.go:117] "RemoveContainer" containerID="541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1" Apr 16 18:14:07.565487 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.565463 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1"} err="failed to get container status \"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1\": rpc error: code = NotFound desc = could not find container \"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1\": container with ID starting with 541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1 not found: ID does not exist" Apr 16 18:14:07.565487 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.565490 2574 scope.go:117] "RemoveContainer" containerID="0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860" Apr 16 18:14:07.565774 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:14:07.565750 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860\": container with ID starting with 0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860 not found: ID does not exist" containerID="0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860" Apr 16 18:14:07.565822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.565789 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860"} err="failed to get container status \"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860\": rpc error: code = NotFound desc = could not find container 
\"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860\": container with ID starting with 0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860 not found: ID does not exist" Apr 16 18:14:07.565822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.565813 2574 scope.go:117] "RemoveContainer" containerID="80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a" Apr 16 18:14:07.566039 ip-10-0-138-45 kubenswrapper[2574]: E0416 18:14:07.566023 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a\": container with ID starting with 80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a not found: ID does not exist" containerID="80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a" Apr 16 18:14:07.566091 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.566042 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a"} err="failed to get container status \"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a\": rpc error: code = NotFound desc = could not find container \"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a\": container with ID starting with 80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a not found: ID does not exist" Apr 16 18:14:07.566091 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.566055 2574 scope.go:117] "RemoveContainer" containerID="541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1" Apr 16 18:14:07.566232 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.566216 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1"} err="failed to get container status 
\"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1\": rpc error: code = NotFound desc = could not find container \"541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1\": container with ID starting with 541388a65575ef47b00e2700dbbbfc968e2719c76d4dfff0bec32e585c9908c1 not found: ID does not exist" Apr 16 18:14:07.566284 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.566232 2574 scope.go:117] "RemoveContainer" containerID="0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860" Apr 16 18:14:07.566482 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.566467 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860"} err="failed to get container status \"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860\": rpc error: code = NotFound desc = could not find container \"0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860\": container with ID starting with 0038a60ca4ad766545c54d4103550cf40a33307c7e614f69ac9fbe0e650ad860 not found: ID does not exist" Apr 16 18:14:07.566482 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.566483 2574 scope.go:117] "RemoveContainer" containerID="80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a" Apr 16 18:14:07.566757 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.566733 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a"} err="failed to get container status \"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a\": rpc error: code = NotFound desc = could not find container \"80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a\": container with ID starting with 80710249c181d775619f8e19268ed3a573ddbd11fd43f418525f55e33029fa9a not found: ID does not exist" Apr 16 18:14:07.590914 ip-10-0-138-45 
kubenswrapper[2574]: I0416 18:14:07.590853 2574 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-model-cache\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:14:07.590914 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.590876 2574 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-dshm\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:14:07.590914 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.590884 2574 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-home\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:14:07.590914 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.590893 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svqbx\" (UniqueName: \"kubernetes.io/projected/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kube-api-access-svqbx\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:14:07.590914 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.590905 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-tls-certs\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:14:07.591141 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.590919 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664-kserve-provision-location\") on node \"ip-10-0-138-45.ec2.internal\" DevicePath \"\"" Apr 16 18:14:07.786364 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.786336 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"] Apr 16 18:14:07.792266 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:07.792242 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-58bd99f4df-qmfrp"] Apr 16 18:14:08.454300 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:08.454264 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" path="/var/lib/kubelet/pods/4851a74d-dfd8-4bd5-a5f3-567e9fe4f664/volumes" Apr 16 18:14:11.722483 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:11.722454 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-rhf4w_26296ac9-1125-4bfb-a32b-3be340b90915/kuadrant-console-plugin/0.log" Apr 16 18:14:11.773035 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:11.773007 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-9b5kw_6b7b3e39-33a1-454b-a469-1317dc782df3/limitador/0.log" Apr 16 18:14:14.328406 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328346 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kw59r/must-gather-7kfmb"] Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328705 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328718 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main" Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328731 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main" Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328736 
2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328744 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328749 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328759 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="storage-initializer"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328765 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="storage-initializer"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328772 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="storage-initializer"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328777 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="storage-initializer"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328785 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="storage-initializer"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328790 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="storage-initializer"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328798 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="llm-d-routing-sidecar"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328803 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="llm-d-routing-sidecar"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328810 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="llm-d-routing-sidecar"
Apr 16 18:14:14.328822 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328815 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="llm-d-routing-sidecar"
Apr 16 18:14:14.329430 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328873 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="dead6bf3-c514-42e7-8d68-37056d1b826d" containerName="main"
Apr 16 18:14:14.329430 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328881 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="main"
Apr 16 18:14:14.329430 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328891 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="main"
Apr 16 18:14:14.329430 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328896 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4851a74d-dfd8-4bd5-a5f3-567e9fe4f664" containerName="llm-d-routing-sidecar"
Apr 16 18:14:14.329430 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.328903 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9db6872d-fd5b-4e35-82f1-af7192c3c460" containerName="llm-d-routing-sidecar"
Apr 16 18:14:14.332639 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.332620 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kw59r/must-gather-7kfmb"
Apr 16 18:14:14.339963 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.339943 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kw59r\"/\"default-dockercfg-tg6h4\""
Apr 16 18:14:14.340912 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.340889 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kw59r\"/\"openshift-service-ca.crt\""
Apr 16 18:14:14.341005 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.340895 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kw59r\"/\"kube-root-ca.crt\""
Apr 16 18:14:14.355602 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.355582 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kw59r/must-gather-7kfmb"]
Apr 16 18:14:14.446628 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.446597 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8728\" (UniqueName: \"kubernetes.io/projected/d894cdf4-ccdd-4a8c-9044-e3659e316dc4-kube-api-access-w8728\") pod \"must-gather-7kfmb\" (UID: \"d894cdf4-ccdd-4a8c-9044-e3659e316dc4\") " pod="openshift-must-gather-kw59r/must-gather-7kfmb"
Apr 16 18:14:14.446804 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.446634 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d894cdf4-ccdd-4a8c-9044-e3659e316dc4-must-gather-output\") pod \"must-gather-7kfmb\" (UID: \"d894cdf4-ccdd-4a8c-9044-e3659e316dc4\") " pod="openshift-must-gather-kw59r/must-gather-7kfmb"
Apr 16 18:14:14.547709 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.547673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8728\" (UniqueName: \"kubernetes.io/projected/d894cdf4-ccdd-4a8c-9044-e3659e316dc4-kube-api-access-w8728\") pod \"must-gather-7kfmb\" (UID: \"d894cdf4-ccdd-4a8c-9044-e3659e316dc4\") " pod="openshift-must-gather-kw59r/must-gather-7kfmb"
Apr 16 18:14:14.547709 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.547711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d894cdf4-ccdd-4a8c-9044-e3659e316dc4-must-gather-output\") pod \"must-gather-7kfmb\" (UID: \"d894cdf4-ccdd-4a8c-9044-e3659e316dc4\") " pod="openshift-must-gather-kw59r/must-gather-7kfmb"
Apr 16 18:14:14.548005 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.547991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d894cdf4-ccdd-4a8c-9044-e3659e316dc4-must-gather-output\") pod \"must-gather-7kfmb\" (UID: \"d894cdf4-ccdd-4a8c-9044-e3659e316dc4\") " pod="openshift-must-gather-kw59r/must-gather-7kfmb"
Apr 16 18:14:14.559824 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.559793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8728\" (UniqueName: \"kubernetes.io/projected/d894cdf4-ccdd-4a8c-9044-e3659e316dc4-kube-api-access-w8728\") pod \"must-gather-7kfmb\" (UID: \"d894cdf4-ccdd-4a8c-9044-e3659e316dc4\") " pod="openshift-must-gather-kw59r/must-gather-7kfmb"
Apr 16 18:14:14.641548 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.641463 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kw59r/must-gather-7kfmb"
Apr 16 18:14:14.767376 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.767345 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kw59r/must-gather-7kfmb"]
Apr 16 18:14:14.770589 ip-10-0-138-45 kubenswrapper[2574]: W0416 18:14:14.770546 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd894cdf4_ccdd_4a8c_9044_e3659e316dc4.slice/crio-1a3132cb5d3bd8b3ebf988a04d76044dd2e477204bcedc0327b1682e9d3b6753 WatchSource:0}: Error finding container 1a3132cb5d3bd8b3ebf988a04d76044dd2e477204bcedc0327b1682e9d3b6753: Status 404 returned error can't find the container with id 1a3132cb5d3bd8b3ebf988a04d76044dd2e477204bcedc0327b1682e9d3b6753
Apr 16 18:14:14.772283 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:14.772266 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:14:15.481485 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:15.481446 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/must-gather-7kfmb" event={"ID":"d894cdf4-ccdd-4a8c-9044-e3659e316dc4","Type":"ContainerStarted","Data":"1a3132cb5d3bd8b3ebf988a04d76044dd2e477204bcedc0327b1682e9d3b6753"}
Apr 16 18:14:16.488878 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:16.488045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/must-gather-7kfmb" event={"ID":"d894cdf4-ccdd-4a8c-9044-e3659e316dc4","Type":"ContainerStarted","Data":"104d36c7b35112b6b9a2870534a40af9f9e18ce0e8214bb48799cc0b78fff22e"}
Apr 16 18:14:16.488878 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:16.488093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/must-gather-7kfmb" event={"ID":"d894cdf4-ccdd-4a8c-9044-e3659e316dc4","Type":"ContainerStarted","Data":"2fbcd7329d834dbca4a2ffc4cefc5c41d481b36a5981b41c3dcfece54d4b4ff9"}
Apr 16 18:14:16.508889 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:16.508784 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kw59r/must-gather-7kfmb" podStartSLOduration=1.701840647 podStartE2EDuration="2.508764567s" podCreationTimestamp="2026-04-16 18:14:14 +0000 UTC" firstStartedPulling="2026-04-16 18:14:14.772401245 +0000 UTC m=+2064.931771504" lastFinishedPulling="2026-04-16 18:14:15.579325159 +0000 UTC m=+2065.738695424" observedRunningTime="2026-04-16 18:14:16.50861093 +0000 UTC m=+2066.667981215" watchObservedRunningTime="2026-04-16 18:14:16.508764567 +0000 UTC m=+2066.668134850"
Apr 16 18:14:17.596106 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:17.596074 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-49fnc_cf7246f1-b28a-4d7e-bda9-3608f1c76516/global-pull-secret-syncer/0.log"
Apr 16 18:14:17.737938 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:17.737908 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9465m_c9af89df-b4ab-4045-b2fc-4762862c0d37/konnectivity-agent/0.log"
Apr 16 18:14:17.862648 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:17.862544 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-45.ec2.internal_2eb2699ca22310dd4864f65d554362ea/haproxy/0.log"
Apr 16 18:14:22.020491 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:22.020353 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-rhf4w_26296ac9-1125-4bfb-a32b-3be340b90915/kuadrant-console-plugin/0.log"
Apr 16 18:14:22.103795 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:22.103760 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-9b5kw_6b7b3e39-33a1-454b-a469-1317dc782df3/limitador/0.log"
Apr 16 18:14:23.722607 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:23.722521 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bc7hk_d187c518-52cd-400f-b7eb-ee4502e56915/node-exporter/0.log"
Apr 16 18:14:23.750217 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:23.750188 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bc7hk_d187c518-52cd-400f-b7eb-ee4502e56915/kube-rbac-proxy/0.log"
Apr 16 18:14:23.776441 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:23.776351 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bc7hk_d187c518-52cd-400f-b7eb-ee4502e56915/init-textfile/0.log"
Apr 16 18:14:26.020291 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.020254 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-bbrtg_c8d26167-1916-46e8-8dba-80fb8694cf64/networking-console-plugin/0.log"
Apr 16 18:14:26.225106 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.225066 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"]
Apr 16 18:14:26.230819 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.230792 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.240225 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.240197 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"]
Apr 16 18:14:26.360055 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.359974 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-proc\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.360199 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.360101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-sys\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.360199 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.360138 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx5cq\" (UniqueName: \"kubernetes.io/projected/9623b370-f53c-4514-9698-4f85b8238764-kube-api-access-tx5cq\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.360199 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.360170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-podres\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.360199 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.360189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-lib-modules\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.461621 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-proc\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.461828 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-sys\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.461828 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx5cq\" (UniqueName: \"kubernetes.io/projected/9623b370-f53c-4514-9698-4f85b8238764-kube-api-access-tx5cq\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.461828 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-podres\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.461828 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-lib-modules\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.461828 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461791 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-proc\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.461828 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-sys\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.462139 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-podres\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.462139 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.461887 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9623b370-f53c-4514-9698-4f85b8238764-lib-modules\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.473694 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.473669 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx5cq\" (UniqueName: \"kubernetes.io/projected/9623b370-f53c-4514-9698-4f85b8238764-kube-api-access-tx5cq\") pod \"perf-node-gather-daemonset-6599h\" (UID: \"9623b370-f53c-4514-9698-4f85b8238764\") " pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.541638 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.541610 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:26.663951 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.663916 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/1.log"
Apr 16 18:14:26.674041 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.674019 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-zcvg5_05d7dd61-bda6-4c4d-87b6-86d7c20fe0cd/console-operator/2.log"
Apr 16 18:14:26.688026 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:26.687997 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"]
Apr 16 18:14:27.535433 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:27.535382 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h" event={"ID":"9623b370-f53c-4514-9698-4f85b8238764","Type":"ContainerStarted","Data":"5197620edb5557d19399d4019663e92cbd8e6d29ab21581ea80958f6358cd9ba"}
Apr 16 18:14:27.535433 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:27.535435 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:27.535987 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:27.535447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h" event={"ID":"9623b370-f53c-4514-9698-4f85b8238764","Type":"ContainerStarted","Data":"fa5dc691a6eaaee99fca230b5d297af15e7ee295c0cf7705163a6badb496a4cb"}
Apr 16 18:14:27.554624 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:27.554553 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h" podStartSLOduration=1.55453722 podStartE2EDuration="1.55453722s" podCreationTimestamp="2026-04-16 18:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:14:27.552509392 +0000 UTC m=+2077.711879674" watchObservedRunningTime="2026-04-16 18:14:27.55453722 +0000 UTC m=+2077.713907553"
Apr 16 18:14:27.715254 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:27.715225 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-n6pht_c2483f11-da48-413a-a820-2d4de16e6ae2/volume-data-source-validator/0.log"
Apr 16 18:14:28.672630 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:28.672597 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bs84n_d237b125-8598-4bde-9a2c-f3145e257d0a/dns/0.log"
Apr 16 18:14:28.703402 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:28.703372 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bs84n_d237b125-8598-4bde-9a2c-f3145e257d0a/kube-rbac-proxy/0.log"
Apr 16 18:14:28.790361 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:28.790334 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ncccq_cf1030f1-c004-49a8-a6a6-1c7e1bbe4b25/dns-node-resolver/0.log"
Apr 16 18:14:29.441797 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:29.441766 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gbhm8_810acbe7-5369-4316-8f52-ddfcdd34e838/node-ca/0.log"
Apr 16 18:14:30.987777 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:30.987746 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jv4sl_c1e50cfa-f2ad-4fdb-bf97-8bab7c30928c/serve-healthcheck-canary/0.log"
Apr 16 18:14:31.697583 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:31.697534 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4q6m_11da63ca-3756-4ddf-8338-7b87d9600243/kube-rbac-proxy/0.log"
Apr 16 18:14:31.725412 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:31.725383 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4q6m_11da63ca-3756-4ddf-8338-7b87d9600243/exporter/0.log"
Apr 16 18:14:31.758974 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:31.758940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4q6m_11da63ca-3756-4ddf-8338-7b87d9600243/extractor/0.log"
Apr 16 18:14:33.550158 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:33.550119 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kw59r/perf-node-gather-daemonset-6599h"
Apr 16 18:14:34.641856 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:34.641818 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-64d875bb5b-lpxfh_c72848ee-1c1c-4e36-8b44-84817597b2e5/manager/0.log"
Apr 16 18:14:34.707091 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:34.707057 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-7zznt_2dbf90b0-cdb4-49bf-9657-25e58574cc52/openshift-lws-operator/0.log"
Apr 16 18:14:35.580026 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:35.579991 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-mtdcv_663cb3c8-1b78-4009-9914-a9da0d3efede/s3-init/0.log"
Apr 16 18:14:41.071016 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:41.070957 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qdmf7_7df6448d-96ef-4887-bf9c-4b4dfb72c238/kube-storage-version-migrator-operator/1.log"
Apr 16 18:14:41.072626 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:41.072597 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-qdmf7_7df6448d-96ef-4887-bf9c-4b4dfb72c238/kube-storage-version-migrator-operator/0.log"
Apr 16 18:14:42.127129 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.127097 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7zg22_c1f0a51e-003d-4753-8b66-791533e66dea/kube-multus/0.log"
Apr 16 18:14:42.184336 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.184310 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ww9l_37af66f0-77b6-4e5a-9cd0-c7267f06ab11/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:14:42.209227 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.209202 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ww9l_37af66f0-77b6-4e5a-9cd0-c7267f06ab11/egress-router-binary-copy/0.log"
Apr 16 18:14:42.233769 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.233743 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ww9l_37af66f0-77b6-4e5a-9cd0-c7267f06ab11/cni-plugins/0.log"
Apr 16 18:14:42.258589 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.258555 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ww9l_37af66f0-77b6-4e5a-9cd0-c7267f06ab11/bond-cni-plugin/0.log"
Apr 16 18:14:42.287175 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.287145 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ww9l_37af66f0-77b6-4e5a-9cd0-c7267f06ab11/routeoverride-cni/0.log"
Apr 16 18:14:42.315218 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.315188 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ww9l_37af66f0-77b6-4e5a-9cd0-c7267f06ab11/whereabouts-cni-bincopy/0.log"
Apr 16 18:14:42.342155 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.342122 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5ww9l_37af66f0-77b6-4e5a-9cd0-c7267f06ab11/whereabouts-cni/0.log"
Apr 16 18:14:42.840802 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.840768 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cxxrm_88ac585a-175b-45f2-8696-1c754b2a5a05/network-metrics-daemon/0.log"
Apr 16 18:14:42.864618 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:42.864590 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cxxrm_88ac585a-175b-45f2-8696-1c754b2a5a05/kube-rbac-proxy/0.log"
Apr 16 18:14:43.833529 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:43.833492 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-controller/0.log"
Apr 16 18:14:43.852798 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:43.852712 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/0.log"
Apr 16 18:14:43.872193 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:43.872166 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovn-acl-logging/1.log"
Apr 16 18:14:43.903854 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:43.903820 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/kube-rbac-proxy-node/0.log"
Apr 16 18:14:43.938802 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:43.938774 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:14:43.972515 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:43.972486 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/northd/0.log"
Apr 16 18:14:44.004312 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:44.004276 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/nbdb/0.log"
Apr 16 18:14:44.053987 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:44.053947 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/sbdb/0.log"
Apr 16 18:14:44.245610 ip-10-0-138-45 kubenswrapper[2574]: I0416 18:14:44.245496 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42vwn_2ee69609-b777-4f06-988b-8e74dd389f13/ovnkube-controller/0.log"